Apr 22 18:35:03.960254 ip-10-0-139-10 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:35:03.960269 ip-10-0-139-10 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:35:03.960279 ip-10-0-139-10 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:35:03.960493 ip-10-0-139-10 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:35:14.009667 ip-10-0-139-10 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:35:14.009686 ip-10-0-139-10 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot c795c9999ad64ce3b6275c5a88be1614 --
Apr 22 18:37:44.247008 ip-10-0-139-10 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:37:44.669461 ip-10-0-139-10 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:44.669461 ip-10-0-139-10 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:37:44.669461 ip-10-0-139-10 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:44.669461 ip-10-0-139-10 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:37:44.669461 ip-10-0-139-10 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:37:44.670285 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.670205 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:37:44.678647 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678622 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:44.678647 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678641 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:44.678647 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678647 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:44.678647 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678651 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:44.678647 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678655 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678660 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678664 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678667 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678671 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678675 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678679 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678683 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678688 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678694 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678699 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678703 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678707 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678711 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678731 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678735 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678749 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678753 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678757 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678761 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:44.678975 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678764 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678768 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678771 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678776 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678779 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678783 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678788 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678792 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678795 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678800 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678803 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678808 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678812 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678816 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678821 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678826 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678840 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678849 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678855 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678859 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:44.679790 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678863 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678869 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678875 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678879 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678884 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678889 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678893 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678897 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678902 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678906 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678911 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678914 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678919 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678923 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678927 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678932 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678938 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678954 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678959 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:44.680496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678964 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678968 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678972 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678976 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678980 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678984 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678988 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678992 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.678997 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679001 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679005 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679010 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679014 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679018 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679022 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679027 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679032 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679036 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679041 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679046 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:44.680992 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679050 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679054 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679059 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679733 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679744 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679750 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679755 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679760 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679764 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679769 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679775 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679781 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679786 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679790 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679794 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679798 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679802 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679806 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679810 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:44.681504 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679815 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679819 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679823 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679827 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679831 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679835 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679839 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679843 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679849 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679853 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679857 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679861 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679865 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679870 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679874 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679879 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679883 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679888 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679892 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679896 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:44.681995 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679902 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679906 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679910 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679914 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679918 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679923 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679927 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679931 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679935 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679940 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679944 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679948 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679953 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679957 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679962 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679966 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679970 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679974 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679979 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679983 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:44.682496 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679987 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679992 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.679997 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680001 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680005 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680010 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680016 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680020 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680025 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680029 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680033 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680037 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680041 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680045 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680049 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680056 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680062 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680066 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680071 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:44.683379 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680076 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680080 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680085 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680089 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680093 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680098 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680102 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680106 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680110 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680114 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.680118 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682517 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682534 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682544 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682550 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682560 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682566 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682573 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682580 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682586 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682591 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:37:44.684117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682596 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682602 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682607 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682612 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682616 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682621 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682626 2577 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682631 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682636 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682643 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682648 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682654 2577 flags.go:64] FLAG: --config-dir=""
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682659 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682664 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682671 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682675 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682680 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682686 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682691 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682696 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682701 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682706 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682710 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682733 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682738 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:37:44.684627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682743 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682747 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682753 2577 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682758 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682765 2577 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682770 2577 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682775 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682780 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682785 2577 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682791 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682796 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682801 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682806 2577 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682811 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682816 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682821 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682825 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682830 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682835 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682840 2577 flags.go:64] FLAG: --feature-gates="" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682846 2577 
flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682851 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682856 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682861 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682866 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:37:44.685339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682871 2577 flags.go:64] FLAG: --help="false" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682876 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-139-10.ec2.internal" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682881 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682886 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682891 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682896 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682902 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682907 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682911 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:37:44.686184 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:37:44.682916 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682923 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682928 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682933 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682938 2577 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682943 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682956 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682961 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682966 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682970 2577 flags.go:64] FLAG: --lock-file="" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682975 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682980 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682985 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682994 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:37:44.686184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.682999 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 
22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683003 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683008 2577 flags.go:64] FLAG: --logging-format="text" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683014 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683019 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683024 2577 flags.go:64] FLAG: --manifest-url="" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683028 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683035 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683040 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683047 2577 flags.go:64] FLAG: --max-pods="110" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683051 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683057 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683061 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683066 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683071 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 
18:37:44.683076 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683081 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683094 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683100 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683104 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683110 2577 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683115 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683124 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683128 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:37:44.686777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683133 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683138 2577 flags.go:64] FLAG: --port="10250" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683143 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683148 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03e6e461d77d7daa1" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683153 2577 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: 
I0422 18:37:44.683158 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683163 2577 flags.go:64] FLAG: --register-node="true" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683168 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683173 2577 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683179 2577 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683184 2577 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683188 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683193 2577 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683199 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683203 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683208 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683213 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683218 2577 flags.go:64] FLAG: --runonce="false" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683223 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683228 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 
18:37:44.683233 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683238 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683245 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683250 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683255 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683261 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:37:44.687484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683266 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683271 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683275 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683281 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683286 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683291 2577 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683295 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683305 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683310 2577 flags.go:64] FLAG: --tls-cert-file="" 
Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683315 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683321 2577 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683326 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683330 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683335 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683340 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683345 2577 flags.go:64] FLAG: --v="2" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683351 2577 flags.go:64] FLAG: --version="false" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683358 2577 flags.go:64] FLAG: --vmodule="" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683364 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.683370 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683520 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683527 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683532 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: 
W0422 18:37:44.683536 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:37:44.688194 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683540 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683545 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683549 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683553 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683559 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683564 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683568 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683572 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683576 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683581 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683585 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683589 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 
18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683595 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683599 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683603 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683607 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683611 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683615 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683619 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683623 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:37:44.688793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683628 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683633 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683637 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683641 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683645 2577 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683649 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683653 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683657 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683662 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683670 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683674 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683678 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683682 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683687 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683691 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683695 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683701 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683705 2577 feature_gate.go:328] unrecognized 
feature gate: MultiArchInstallAzure Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683710 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683733 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:37:44.689289 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683738 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683742 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683747 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683751 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683755 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683760 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683765 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683771 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683777 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683782 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683786 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683790 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683794 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683798 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683802 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683807 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683811 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683815 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683819 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:37:44.689798 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683823 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683827 2577 feature_gate.go:328] unrecognized feature 
gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683833 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683837 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683842 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683846 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683850 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683854 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683858 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683865 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683869 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683873 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683877 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683881 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:37:44.690259 
ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683885 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683889 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683895 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683899 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683904 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:44.690259 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683909 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:44.690823 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683913 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:44.690823 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683917 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:44.690823 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.683923 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:44.690823 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.684959 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:37:44.691823 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.691703 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:37:44.691862 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.691825 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:37:44.691893 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691873 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:44.691893 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691880 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:44.691893 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691883 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:44.691893 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691887 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:44.691893 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691890 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:44.691893 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691893 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:44.691893 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691896 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691899 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691902 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691904 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691907 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691910 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691913 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691915 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691918 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691921 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691923 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691926 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691928 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691931 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691934 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691937 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691939 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691941 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691944 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:44.692070 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691947 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691949 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691952 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691954 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691957 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691959 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691962 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691965 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691968 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691970 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691973 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691975 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691978 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691981 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691983 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691987 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691990 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691992 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691995 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:44.692534 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.691998 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692000 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692003 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692006 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692008 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692011 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692013 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692017 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692021 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692025 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692027 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692030 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692033 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692035 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692038 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692040 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692043 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692046 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692049 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692051 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:44.693000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692055 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692057 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692060 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692063 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692065 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692068 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692071 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692074 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692077 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692080 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692083 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692086 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692089 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692091 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692094 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692096 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692099 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692103 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692106 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692109 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:44.693488 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692111 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692114 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.692119 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692216 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692221 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692223 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692226 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692229 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692232 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692234 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692237 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692241 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692246 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692249 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692252 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:37:44.694037 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692255 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692257 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692260 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692262 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692265 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692268 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692271 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692273 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692276 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692278 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692281 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692283 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692286 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692289 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692291 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692294 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692296 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692299 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692301 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692304 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:37:44.694408 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692306 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692309 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692311 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692314 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692316 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692319 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692322 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692324 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692327 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692329 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692332 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692335 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692338 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692341 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692343 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692345 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692348 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692351 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692353 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692356 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:37:44.694906 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692358 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692361 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692363 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692366 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692368 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692371 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692373 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692376 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692380 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692383 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692385 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692388 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692390 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692393 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692395 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692398 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692400 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692403 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692405 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:37:44.695398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692408 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692410 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692413 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692416 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692418 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692421 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692424 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692427 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692429 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692432 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692434 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692437 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692439 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692442 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:44.692444 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:37:44.695874 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.692449 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:37:44.696227 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.693168 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:37:44.696542 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.696529 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:37:44.697461 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.697451 2577 server.go:1019] "Starting client certificate rotation"
Apr 22 18:37:44.697558 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.697542 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:37:44.697593 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.697583 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:37:44.722147 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.722131 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:37:44.727000 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.726984 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:37:44.745377 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.745361 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:37:44.751171 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.751156 2577 log.go:25] "Validated CRI v1 image API"
Apr 22 18:37:44.752284 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.752269 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:37:44.756369 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.756351 2577 fs.go:135] Filesystem UUIDs: map[3152858e-d8ee-48eb-a14a-1a591487b720:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e3c12954-453c-46e4-a1fc-951ba58f4867:/dev/nvme0n1p3]
Apr 22 18:37:44.756427 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.756368 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:37:44.758073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.758056 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:37:44.762475 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.762373 2577 manager.go:217] Machine: {Timestamp:2026-04-22 18:37:44.760232536 +0000 UTC m=+0.398757649 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3119016 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec292a84a676d93bce499caa9a47d9fe SystemUUID:ec292a84-a676-d93b-ce49-9caa9a47d9fe BootID:c795c999-9ad6-4ce3-b627-5c5a88be1614 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:13:77:1f:d4:fd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:13:77:1f:d4:fd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:41:e0:51:a6:8a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:37:44.762475 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.762470 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:37:44.762591 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.762579 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:37:44.763803 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.763778 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:37:44.763939 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.763806 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-10.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:37:44.763981 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.763947 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:37:44.763981 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.763956 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:37:44.763981 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.763969 2577
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:37:44.764800 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.764790 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:37:44.765767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.765758 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:37:44.765872 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.765863 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:37:44.768224 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.768214 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:37:44.768258 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.768233 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:37:44.768258 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.768245 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:37:44.768258 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.768253 2577 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:37:44.768348 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.768261 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:37:44.769534 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.769522 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:37:44.769575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.769539 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:37:44.772773 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.772755 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:37:44.774197 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:37:44.774180 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:37:44.776213 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776198 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:37:44.776286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776219 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:37:44.776286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776228 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:37:44.776286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776236 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:37:44.776286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776245 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:37:44.776286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776254 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:37:44.776286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776263 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:37:44.776286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776271 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:37:44.776286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776282 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:37:44.776548 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776291 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:37:44.776548 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776311 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:37:44.776548 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.776325 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:37:44.777360 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.777349 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:37:44.777419 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.777364 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:37:44.780749 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.780734 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:37:44.780854 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.780777 2577 server.go:1295] "Started kubelet" Apr 22 18:37:44.781139 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.781083 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:37:44.781291 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.781246 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:37:44.781332 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.781301 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-10.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:37:44.781332 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.781319 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:37:44.781685 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.781654 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-10.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:37:44.781871 ip-10-0-139-10 
kubenswrapper[2577]: E0422 18:37:44.781852 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:37:44.782051 ip-10-0-139-10 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:37:44.782495 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.782419 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:37:44.784158 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.784141 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:37:44.784893 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.784868 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nlp85" Apr 22 18:37:44.789817 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.789798 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:37:44.790378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.790364 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:37:44.791049 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.789854 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-10.ec2.internal.18a8c1bbaba9ed7f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-10.ec2.internal,UID:ip-10-0-139-10.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-10.ec2.internal,},FirstTimestamp:2026-04-22 18:37:44.780746111 +0000 UTC m=+0.419271249,LastTimestamp:2026-04-22 18:37:44.780746111 +0000 UTC m=+0.419271249,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-10.ec2.internal,}" Apr 22 18:37:44.791238 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791225 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:37:44.791277 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791241 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:37:44.791320 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791312 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:37:44.791365 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791351 2577 factory.go:55] Registering systemd factory Apr 22 18:37:44.791398 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791375 2577 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:37:44.791398 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.791384 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 18:37:44.791473 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791361 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:37:44.791473 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791420 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:37:44.791607 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791591 2577 factory.go:153] Registering CRI-O factory Apr 22 18:37:44.791705 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791670 2577 factory.go:223] Registration of the crio container factory successfully Apr 22 18:37:44.791778 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791739 
2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:37:44.791778 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791765 2577 factory.go:103] Registering Raw factory Apr 22 18:37:44.791866 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.791779 2577 manager.go:1196] Started watching for new ooms in manager Apr 22 18:37:44.792079 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.792054 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:37:44.792138 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.792131 2577 manager.go:319] Starting recovery of all containers Apr 22 18:37:44.793681 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.793663 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nlp85" Apr 22 18:37:44.797947 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.797907 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 18:37:44.798885 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.798861 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:44.801566 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.801544 2577 manager.go:324] Recovery completed Apr 22 18:37:44.803430 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.803412 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-10.ec2.internal\" not found" node="ip-10-0-139-10.ec2.internal" Apr 22 18:37:44.806265 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.806252 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:44.810265 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.810241 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:44.810322 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.810269 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:44.810322 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.810279 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:44.810798 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.810786 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:37:44.810798 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.810797 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:37:44.810885 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.810812 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:37:44.812876 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.812865 2577 policy_none.go:49] "None policy: Start" Apr 22 18:37:44.812920 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.812879 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:37:44.812920 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.812889 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:37:44.847302 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.847280 2577 manager.go:341] "Starting Device Plugin manager" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.847324 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.847333 2577 server.go:85] "Starting device plugin registration server" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.847541 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.847554 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.847630 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.847706 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.847730 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.848458 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 18:37:44.858764 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.848497 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 18:37:44.879243 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.879225 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:37:44.879316 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.879254 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:37:44.879316 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.879274 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:37:44.879316 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.879280 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:37:44.879449 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.879344 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:37:44.881586 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.881568 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:44.947752 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.947676 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:44.948690 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.948669 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:44.948783 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.948700 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:44.948783 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:37:44.948711 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:44.948783 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.948747 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-10.ec2.internal" Apr 22 18:37:44.957703 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.957688 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-10.ec2.internal" Apr 22 18:37:44.957800 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.957706 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-10.ec2.internal\": node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 18:37:44.977960 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.977940 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 18:37:44.980003 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.979988 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal"] Apr 22 18:37:44.980059 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.980044 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:44.980757 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.980741 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:44.980813 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.980772 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:44.980813 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.980782 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:44.982040 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982029 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:44.982202 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982189 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 18:37:44.982246 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982216 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:44.982654 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982638 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:44.982768 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982665 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:44.982768 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982679 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:44.982768 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982686 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:44.982768 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982702 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:44.982768 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.982711 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" 
event="NodeHasSufficientPID" Apr 22 18:37:44.983743 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.983728 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 18:37:44.983819 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.983759 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:37:44.984370 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.984347 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:37:44.984424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.984377 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:37:44.984424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:44.984387 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:37:44.998282 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:44.998265 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-10.ec2.internal\" not found" node="ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.002431 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.002416 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-10.ec2.internal\" not found" node="ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.078267 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.078241 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 18:37:45.092407 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.092390 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baab329e26e3fec548046283f03a6805-config\") pod \"kube-apiserver-proxy-ip-10-0-139-10.ec2.internal\" (UID: \"baab329e26e3fec548046283f03a6805\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.092469 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.092415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.092469 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.092437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.179164 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.179128 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found" Apr 22 18:37:45.193482 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.193456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.193482 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.193484 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.193629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.193503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baab329e26e3fec548046283f03a6805-config\") pod \"kube-apiserver-proxy-ip-10-0-139-10.ec2.internal\" (UID: \"baab329e26e3fec548046283f03a6805\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.193629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.193559 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: \"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.193629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.193569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baab329e26e3fec548046283f03a6805-config\") pod \"kube-apiserver-proxy-ip-10-0-139-10.ec2.internal\" (UID: \"baab329e26e3fec548046283f03a6805\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" Apr 22 18:37:45.193629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.193565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a85548356ac6e6ad9bedf610076abee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal\" (UID: 
\"9a85548356ac6e6ad9bedf610076abee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal"
Apr 22 18:37:45.279955 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.279881 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found"
Apr 22 18:37:45.300381 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.300363 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal"
Apr 22 18:37:45.304865 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.304846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal"
Apr 22 18:37:45.380617 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.380579 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found"
Apr 22 18:37:45.481112 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.481088 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found"
Apr 22 18:37:45.581697 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.581618 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-10.ec2.internal\" not found"
Apr 22 18:37:45.610764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.610741 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:37:45.691499 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.691476 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal"
Apr 22 18:37:45.697774 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.697759 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:37:45.697869 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.697854 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:37:45.697930 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.697901 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:37:45.697974 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.697902 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:37:45.697974 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.697931 2577 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a4b7c5e78a2224c7fa9e809d68083e50-e41dd2c45f21093e.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.139.10:37222->52.203.42.218:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal"
Apr 22 18:37:45.697974 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.697957 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal"
Apr 22 18:37:45.717456 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.717433 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:37:45.769135 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.769111 2577 apiserver.go:52] "Watching apiserver"
Apr 22 18:37:45.775963 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.775941 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:37:45.777753 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.777731 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-jpzr8","kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal","openshift-dns/node-resolver-642nb","openshift-image-registry/node-ca-648xg","openshift-multus/multus-pr4sw","openshift-multus/network-metrics-daemon-cqf5t","openshift-ovn-kubernetes/ovnkube-node-bm5qp","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j","openshift-cluster-node-tuning-operator/tuned-8qzs8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal","openshift-multus/multus-additional-cni-plugins-fcffm","openshift-network-diagnostics/network-check-target-jkhtl","openshift-network-operator/iptables-alerter-lpc4b"]
Apr 22 18:37:45.780080 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.780060 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.781170 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.781152 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-648xg"
Apr 22 18:37:45.781270 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.781253 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jpzr8"
Apr 22 18:37:45.782554 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.782534 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:37:45.783952 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.783759 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:37:45.783952 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.783778 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:37:45.783952 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.783785 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:37:45.783952 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.783869 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:37:45.784173 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.784163 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2wpvl\""
Apr 22 18:37:45.784279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.784185 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:37:45.784279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.784191 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:37:45.784377 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.784292 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:37:45.784584 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.784485 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pjkgq\""
Apr 22 18:37:45.784584 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.784547 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:37:45.784802 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.784781 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:37:45.784889 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.784828 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:37:45.785094 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.785079 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j"
Apr 22 18:37:45.785167 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.785135 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-642nb"
Apr 22 18:37:45.785167 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.785083 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dmt8l\""
Apr 22 18:37:45.786641 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.786623 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.787193 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.787178 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zn9tj\""
Apr 22 18:37:45.787285 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.787268 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:37:45.787887 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.787869 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:45.787969 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.787945 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8"
Apr 22 18:37:45.788106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788090 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:37:45.788152 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788091 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:37:45.788407 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788388 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:37:45.788407 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788400 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:37:45.788529 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788408 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pchrd\""
Apr 22 18:37:45.788699 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788679 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:37:45.788781 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788746 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:37:45.788781 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788759 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:37:45.788781 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788749 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zjrcb\""
Apr 22 18:37:45.788924 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788813 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:37:45.788924 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.788920 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.789862 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.789845 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:37:45.790136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.790120 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fcffm"
Apr 22 18:37:45.790879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.790861 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:37:45.790879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.790868 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:37:45.790997 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.790976 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kb59n\""
Apr 22 18:37:45.791204 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.791191 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:45.791264 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.791247 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:37:45.791887 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.791870 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:37:45.792098 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.792083 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zclmc\""
Apr 22 18:37:45.792171 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.792154 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:37:45.792231 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.792167 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lpc4b"
Apr 22 18:37:45.794387 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.794364 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:37:45.794470 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.794430 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:37:45.794615 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.794600 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r2fpj\""
Apr 22 18:37:45.794695 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.794680 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:37:45.796123 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796100 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:32:44 +0000 UTC" deadline="2027-12-30 14:05:16.809423066 +0000 UTC"
Apr 22 18:37:45.796189 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796123 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14803h27m31.013303159s"
Apr 22 18:37:45.796789 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgnbv\" (UniqueName: \"kubernetes.io/projected/b4af3b82-6cae-4792-8fe7-cf2daed473d1-kube-api-access-fgnbv\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm"
Apr 22 18:37:45.796847 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f6716c1-8454-4bb0-a15d-144eeaa62e20-host\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg"
Apr 22 18:37:45.796847 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796812 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpp79\" (UniqueName: \"kubernetes.io/projected/4f6716c1-8454-4bb0-a15d-144eeaa62e20-kube-api-access-zpp79\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg"
Apr 22 18:37:45.796847 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-systemd\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.796847 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796841 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm"
Apr 22 18:37:45.797029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796863 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm"
Apr 22 18:37:45.797029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796915 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-netns\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.797029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796948 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-hostroot\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.797029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.796977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-host-slash\") pod \"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b"
Apr 22 18:37:45.797029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-node-log\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-run-ovn-kubernetes\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-registration-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j"
Apr 22 18:37:45.797199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797075 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/174b9358-0c72-4c5e-94ed-aeee18c176a2-tmp\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.797199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-run-netns\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797120 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm"
Apr 22 18:37:45.797199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f6716c1-8454-4bb0-a15d-144eeaa62e20-serviceca\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg"
Apr 22 18:37:45.797199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-multus-certs\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.797199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-run\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797214 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-var-lib-kubelet\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj9td\" (UniqueName: \"kubernetes.io/projected/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-kube-api-access-fj9td\") pod \"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-system-cni-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-modprobe-d\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-kubernetes\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797327 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-var-lib-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfc4f516-12a3-4b4c-948f-d21348678585-ovn-node-metrics-cert\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/27440aa5-4698-4ea7-b7a8-ca0f7994d4e8-agent-certs\") pod \"konnectivity-agent-jpzr8\" (UID: \"27440aa5-4698-4ea7-b7a8-ca0f7994d4e8\") " pod="kube-system/konnectivity-agent-jpzr8"
Apr 22 18:37:45.797430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-os-release\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797441 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-socket-dir-parent\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/556c4304-c27e-49e6-9289-7b8986ec176b-multus-daemon-config\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-sys\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-etc-selinux\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-etc-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797601 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-ovn\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-socket-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysconfig\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysctl-d\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797694 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-systemd-units\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/27440aa5-4698-4ea7-b7a8-ca0f7994d4e8-konnectivity-ca\") pod \"konnectivity-agent-jpzr8\" (UID: \"27440aa5-4698-4ea7-b7a8-ca0f7994d4e8\") " pod="kube-system/konnectivity-agent-jpzr8"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxmh5\" (UniqueName: \"kubernetes.io/projected/e3bffc40-492a-471a-83d2-c9bd203d82a8-kube-api-access-jxmh5\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grl7z\" (UniqueName: \"kubernetes.io/projected/728367cd-e906-4445-a992-1b5b0aebdce9-kube-api-access-grl7z\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j"
Apr 22 18:37:45.797848 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797780 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-cni-bin\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-conf-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-host\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797848 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-cni-netd\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-ovnkube-script-lib\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-sys-fs\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-kubelet\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-slash\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/556c4304-c27e-49e6-9289-7b8986ec176b-cni-binary-copy\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.797988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-env-overrides\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsx7r\" (UniqueName: \"kubernetes.io/projected/556c4304-c27e-49e6-9289-7b8986ec176b-kube-api-access-bsx7r\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysctl-conf\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8"
Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798080 2577 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-tuned\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-iptables-alerter-script\") pod \"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b" Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f72f07da-c956-44a1-91e4-efb83a4ae9fc-tmp-dir\") pod \"node-resolver-642nb\" (UID: \"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.798462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-cni-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-systemd\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.799010 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:37:45.798215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-lib-modules\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp6f5\" (UniqueName: \"kubernetes.io/projected/174b9358-0c72-4c5e-94ed-aeee18c176a2-kube-api-access-jp6f5\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-log-socket\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-kubelet\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5nt6\" (UniqueName: \"kubernetes.io/projected/bfc4f516-12a3-4b4c-948f-d21348678585-kube-api-access-v5nt6\") pod \"ovnkube-node-bm5qp\" (UID: 
\"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798396 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-system-cni-dir\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798433 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-cnibin\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-k8s-cni-cncf-io\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-cni-multus\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798517 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-cni-bin\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f72f07da-c956-44a1-91e4-efb83a4ae9fc-hosts-file\") pod \"node-resolver-642nb\" (UID: \"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-os-release\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-device-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.799010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798625 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-etc-kubernetes\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.799498 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-ovnkube-config\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.799498 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5wc\" (UniqueName: \"kubernetes.io/projected/f72f07da-c956-44a1-91e4-efb83a4ae9fc-kube-api-access-cz5wc\") pod \"node-resolver-642nb\" (UID: \"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.799498 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.798710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cnibin\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.801172 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.801157 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:37:45.825048 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.825030 2577 csr.go:274] "Certificate signing request is approved, 
waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hpbmk" Apr 22 18:37:45.834027 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.833975 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hpbmk" Apr 22 18:37:45.883943 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:45.883917 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a85548356ac6e6ad9bedf610076abee.slice/crio-df607a7c6ceed43cc161db9bd07f3ef0223bb951a8393b67e68614f1e92bc975 WatchSource:0}: Error finding container df607a7c6ceed43cc161db9bd07f3ef0223bb951a8393b67e68614f1e92bc975: Status 404 returned error can't find the container with id df607a7c6ceed43cc161db9bd07f3ef0223bb951a8393b67e68614f1e92bc975 Apr 22 18:37:45.888860 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.888631 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:37:45.889763 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:45.889647 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaab329e26e3fec548046283f03a6805.slice/crio-d6cf837a29f1bacaeeb55ebc6d1d97d7db10a8f5cabd9152c93fd7547466583e WatchSource:0}: Error finding container d6cf837a29f1bacaeeb55ebc6d1d97d7db10a8f5cabd9152c93fd7547466583e: Status 404 returned error can't find the container with id d6cf837a29f1bacaeeb55ebc6d1d97d7db10a8f5cabd9152c93fd7547466583e Apr 22 18:37:45.892065 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.892046 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:37:45.899095 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-netns\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899189 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-hostroot\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899189 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-host-slash\") pod \"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b" Apr 22 18:37:45.899189 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-node-log\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899189 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-run-ovn-kubernetes\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899189 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-registration-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-node-log\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-netns\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/174b9358-0c72-4c5e-94ed-aeee18c176a2-tmp\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-hostroot\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-host-slash\") pod 
\"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-run-ovn-kubernetes\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-registration-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-run-netns\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899308 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-run-netns\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f6716c1-8454-4bb0-a15d-144eeaa62e20-serviceca\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg" Apr 22 18:37:45.899378 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-multus-certs\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-multus-certs\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-run\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-var-lib-kubelet\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fj9td\" (UniqueName: \"kubernetes.io/projected/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-kube-api-access-fj9td\") pod \"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-run\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-var-lib-kubelet\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-system-cni-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-modprobe-d\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899568 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-system-cni-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-kubernetes\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899468 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-var-lib-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfc4f516-12a3-4b4c-948f-d21348678585-ovn-node-metrics-cert\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/27440aa5-4698-4ea7-b7a8-ca0f7994d4e8-agent-certs\") pod \"konnectivity-agent-jpzr8\" (UID: \"27440aa5-4698-4ea7-b7a8-ca0f7994d4e8\") " pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-modprobe-d\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-os-release\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.899931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-socket-dir-parent\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/556c4304-c27e-49e6-9289-7b8986ec176b-multus-daemon-config\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-var-lib-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-sys\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-kubernetes\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f6716c1-8454-4bb0-a15d-144eeaa62e20-serviceca\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-etc-selinux\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-etc-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-ovn\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-sys\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.899960 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-socket-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-socket-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-etc-selinux\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.900750 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900327 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-etc-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900341 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-os-release\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-ovn\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-socket-dir-parent\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysconfig\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/556c4304-c27e-49e6-9289-7b8986ec176b-multus-daemon-config\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysctl-d\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-systemd-units\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysconfig\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/27440aa5-4698-4ea7-b7a8-ca0f7994d4e8-konnectivity-ca\") pod \"konnectivity-agent-jpzr8\" (UID: \"27440aa5-4698-4ea7-b7a8-ca0f7994d4e8\") " 
pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysctl-d\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900555 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-systemd-units\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxmh5\" (UniqueName: \"kubernetes.io/projected/e3bffc40-492a-471a-83d2-c9bd203d82a8-kube-api-access-jxmh5\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grl7z\" (UniqueName: \"kubernetes.io/projected/728367cd-e906-4445-a992-1b5b0aebdce9-kube-api-access-grl7z\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-cni-bin\") pod 
\"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-conf-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-host\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.901562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-cni-netd\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-ovnkube-script-lib\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900761 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-sys-fs\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: 
\"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-kubelet\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-slash\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-host\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/556c4304-c27e-49e6-9289-7b8986ec176b-cni-binary-copy\") pod 
\"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-env-overrides\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsx7r\" (UniqueName: \"kubernetes.io/projected/556c4304-c27e-49e6-9289-7b8986ec176b-kube-api-access-bsx7r\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.900992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysctl-conf\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-tuned\") pod \"tuned-8qzs8\" (UID: 
\"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-cni-bin\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-iptables-alerter-script\") pod \"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/27440aa5-4698-4ea7-b7a8-ca0f7994d4e8-konnectivity-ca\") pod \"konnectivity-agent-jpzr8\" (UID: \"27440aa5-4698-4ea7-b7a8-ca0f7994d4e8\") " pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-conf-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f72f07da-c956-44a1-91e4-efb83a4ae9fc-tmp-dir\") pod \"node-resolver-642nb\" (UID: 
\"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.902386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-cni-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901145 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-multus-cni-dir\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-sysctl-conf\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-openvswitch\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-systemd\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-lib-modules\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901592 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-cni-netd\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp6f5\" (UniqueName: \"kubernetes.io/projected/174b9358-0c72-4c5e-94ed-aeee18c176a2-kube-api-access-jp6f5\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-log-socket\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-kubelet\") pod \"ovnkube-node-bm5qp\" (UID: 
\"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5nt6\" (UniqueName: \"kubernetes.io/projected/bfc4f516-12a3-4b4c-948f-d21348678585-kube-api-access-v5nt6\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-system-cni-dir\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/556c4304-c27e-49e6-9289-7b8986ec176b-cni-binary-copy\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-cnibin\") pod 
\"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901879 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-systemd\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-k8s-cni-cncf-io\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-log-socket\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-cni-multus\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901950 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/174b9358-0c72-4c5e-94ed-aeee18c176a2-lib-modules\") pod \"tuned-8qzs8\" (UID: 
\"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-cni-bin\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f72f07da-c956-44a1-91e4-efb83a4ae9fc-hosts-file\") pod \"node-resolver-642nb\" (UID: \"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-env-overrides\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-system-cni-dir\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-run-k8s-cni-cncf-io\") pod 
\"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.902068 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-os-release\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.902141 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs podName:e3bffc40-492a-471a-83d2-c9bd203d82a8 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:46.402115692 +0000 UTC m=+2.040640820 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs") pod "network-metrics-daemon-cqf5t" (UID: "e3bffc40-492a-471a-83d2-c9bd203d82a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-device-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-etc-kubernetes\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902205 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-os-release\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-kubelet\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902371 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/174b9358-0c72-4c5e-94ed-aeee18c176a2-tmp\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-device-dir\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-slash\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.903940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902535 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfc4f516-12a3-4b4c-948f-d21348678585-ovn-node-metrics-cert\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902581 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-ovnkube-script-lib\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902663 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-host-cni-bin\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.901970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-cni-multus\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902734 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-cnibin\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-host-var-lib-kubelet\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f72f07da-c956-44a1-91e4-efb83a4ae9fc-hosts-file\") pod \"node-resolver-642nb\" (UID: \"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-ovnkube-config\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfc4f516-12a3-4b4c-948f-d21348678585-ovnkube-config\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5wc\" (UniqueName: \"kubernetes.io/projected/f72f07da-c956-44a1-91e4-efb83a4ae9fc-kube-api-access-cz5wc\") pod \"node-resolver-642nb\" (UID: \"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cnibin\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.902980 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgnbv\" (UniqueName: \"kubernetes.io/projected/b4af3b82-6cae-4792-8fe7-cf2daed473d1-kube-api-access-fgnbv\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903011 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f6716c1-8454-4bb0-a15d-144eeaa62e20-host\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpp79\" (UniqueName: \"kubernetes.io/projected/4f6716c1-8454-4bb0-a15d-144eeaa62e20-kube-api-access-zpp79\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-systemd\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903096 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/728367cd-e906-4445-a992-1b5b0aebdce9-sys-fs\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.904540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903129 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-iptables-alerter-script\") pod \"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/27440aa5-4698-4ea7-b7a8-ca0f7994d4e8-agent-certs\") pod \"konnectivity-agent-jpzr8\" (UID: \"27440aa5-4698-4ea7-b7a8-ca0f7994d4e8\") " pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903182 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f72f07da-c956-44a1-91e4-efb83a4ae9fc-tmp-dir\") pod \"node-resolver-642nb\" (UID: \"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f6716c1-8454-4bb0-a15d-144eeaa62e20-host\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903244 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cnibin\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfc4f516-12a3-4b4c-948f-d21348678585-run-systemd\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/556c4304-c27e-49e6-9289-7b8986ec176b-etc-kubernetes\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903609 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/174b9358-0c72-4c5e-94ed-aeee18c176a2-etc-tuned\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.905148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.903698 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b4af3b82-6cae-4792-8fe7-cf2daed473d1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fcffm\" (UID: \"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.912510 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.912492 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj9td\" (UniqueName: \"kubernetes.io/projected/247d22a1-a2b2-4af9-b8c2-0bfad6ca0664-kube-api-access-fj9td\") pod \"iptables-alerter-lpc4b\" (UID: \"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664\") " pod="openshift-network-operator/iptables-alerter-lpc4b" Apr 22 18:37:45.913049 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.913035 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:37:45.913097 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.913054 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:37:45.913097 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.913064 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gd9ct for pod openshift-network-diagnostics/network-check-target-jkhtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:45.913182 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:45.913109 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct podName:c50ef4af-f64b-4608-b9e7-126d66048d98 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:46.41309592 +0000 UTC m=+2.051621039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gd9ct" (UniqueName: "kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct") pod "network-check-target-jkhtl" (UID: "c50ef4af-f64b-4608-b9e7-126d66048d98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:45.915190 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.915165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsx7r\" (UniqueName: \"kubernetes.io/projected/556c4304-c27e-49e6-9289-7b8986ec176b-kube-api-access-bsx7r\") pod \"multus-pr4sw\" (UID: \"556c4304-c27e-49e6-9289-7b8986ec176b\") " pod="openshift-multus/multus-pr4sw" Apr 22 18:37:45.918299 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.918235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5wc\" (UniqueName: \"kubernetes.io/projected/f72f07da-c956-44a1-91e4-efb83a4ae9fc-kube-api-access-cz5wc\") pod \"node-resolver-642nb\" (UID: \"f72f07da-c956-44a1-91e4-efb83a4ae9fc\") " pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:45.918299 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.918265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgnbv\" (UniqueName: \"kubernetes.io/projected/b4af3b82-6cae-4792-8fe7-cf2daed473d1-kube-api-access-fgnbv\") pod \"multus-additional-cni-plugins-fcffm\" (UID: 
\"b4af3b82-6cae-4792-8fe7-cf2daed473d1\") " pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:45.918447 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.918318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5nt6\" (UniqueName: \"kubernetes.io/projected/bfc4f516-12a3-4b4c-948f-d21348678585-kube-api-access-v5nt6\") pod \"ovnkube-node-bm5qp\" (UID: \"bfc4f516-12a3-4b4c-948f-d21348678585\") " pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:45.918447 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.918357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp6f5\" (UniqueName: \"kubernetes.io/projected/174b9358-0c72-4c5e-94ed-aeee18c176a2-kube-api-access-jp6f5\") pod \"tuned-8qzs8\" (UID: \"174b9358-0c72-4c5e-94ed-aeee18c176a2\") " pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:45.918691 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.918670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxmh5\" (UniqueName: \"kubernetes.io/projected/e3bffc40-492a-471a-83d2-c9bd203d82a8-kube-api-access-jxmh5\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:37:45.919548 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.919531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grl7z\" (UniqueName: \"kubernetes.io/projected/728367cd-e906-4445-a992-1b5b0aebdce9-kube-api-access-grl7z\") pod \"aws-ebs-csi-driver-node-b8b6j\" (UID: \"728367cd-e906-4445-a992-1b5b0aebdce9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:45.919610 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:45.919582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpp79\" (UniqueName: 
\"kubernetes.io/projected/4f6716c1-8454-4bb0-a15d-144eeaa62e20-kube-api-access-zpp79\") pod \"node-ca-648xg\" (UID: \"4f6716c1-8454-4bb0-a15d-144eeaa62e20\") " pod="openshift-image-registry/node-ca-648xg" Apr 22 18:37:46.006068 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.006048 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:46.112130 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.112055 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:37:46.117623 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:46.117602 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfc4f516_12a3_4b4c_948f_d21348678585.slice/crio-33c3f323e3c7743f691df3742f59929afd5019b20eb72ef40a67d4ac596d7479 WatchSource:0}: Error finding container 33c3f323e3c7743f691df3742f59929afd5019b20eb72ef40a67d4ac596d7479: Status 404 returned error can't find the container with id 33c3f323e3c7743f691df3742f59929afd5019b20eb72ef40a67d4ac596d7479 Apr 22 18:37:46.120321 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.120305 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-648xg" Apr 22 18:37:46.126032 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:46.126015 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f6716c1_8454_4bb0_a15d_144eeaa62e20.slice/crio-f0f79309e2d80157d9b85b49dac0ba6cb32cb1e5f441c54bd28abe9b0ee5d3fd WatchSource:0}: Error finding container f0f79309e2d80157d9b85b49dac0ba6cb32cb1e5f441c54bd28abe9b0ee5d3fd: Status 404 returned error can't find the container with id f0f79309e2d80157d9b85b49dac0ba6cb32cb1e5f441c54bd28abe9b0ee5d3fd Apr 22 18:37:46.139531 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.139516 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:37:46.143971 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.143957 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" Apr 22 18:37:46.144894 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:46.144871 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27440aa5_4698_4ea7_b7a8_ca0f7994d4e8.slice/crio-d58ffd8c7ac869225e713a7bb4c1167e04dfbe10ea1e670ffa45ea697a2c190f WatchSource:0}: Error finding container d58ffd8c7ac869225e713a7bb4c1167e04dfbe10ea1e670ffa45ea697a2c190f: Status 404 returned error can't find the container with id d58ffd8c7ac869225e713a7bb4c1167e04dfbe10ea1e670ffa45ea697a2c190f Apr 22 18:37:46.149178 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:46.149156 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod728367cd_e906_4445_a992_1b5b0aebdce9.slice/crio-0d57731a494a000db54015c2a86fdc8802ee5772bb26552df6777590fb528887 WatchSource:0}: Error finding container 
0d57731a494a000db54015c2a86fdc8802ee5772bb26552df6777590fb528887: Status 404 returned error can't find the container with id 0d57731a494a000db54015c2a86fdc8802ee5772bb26552df6777590fb528887 Apr 22 18:37:46.168450 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.168433 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-642nb" Apr 22 18:37:46.175247 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:46.175226 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf72f07da_c956_44a1_91e4_efb83a4ae9fc.slice/crio-2268860cc3aa89163ac8a77a32e257669f65ed655fd1c4ffd40acefdde58d6ba WatchSource:0}: Error finding container 2268860cc3aa89163ac8a77a32e257669f65ed655fd1c4ffd40acefdde58d6ba: Status 404 returned error can't find the container with id 2268860cc3aa89163ac8a77a32e257669f65ed655fd1c4ffd40acefdde58d6ba Apr 22 18:37:46.188410 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.188386 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pr4sw" Apr 22 18:37:46.193930 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:46.193912 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod556c4304_c27e_49e6_9289_7b8986ec176b.slice/crio-42862c3e4b205142a300dda930ad40aa21dd024f4efb2938e2b3d08ee2a7159d WatchSource:0}: Error finding container 42862c3e4b205142a300dda930ad40aa21dd024f4efb2938e2b3d08ee2a7159d: Status 404 returned error can't find the container with id 42862c3e4b205142a300dda930ad40aa21dd024f4efb2938e2b3d08ee2a7159d Apr 22 18:37:46.194566 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.194547 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" Apr 22 18:37:46.199532 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.199516 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fcffm" Apr 22 18:37:46.205412 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:46.205395 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4af3b82_6cae_4792_8fe7_cf2daed473d1.slice/crio-0c300072d1a790ada98d3561debf5e082d2baa118a7d967aa7b43f5ab4b69122 WatchSource:0}: Error finding container 0c300072d1a790ada98d3561debf5e082d2baa118a7d967aa7b43f5ab4b69122: Status 404 returned error can't find the container with id 0c300072d1a790ada98d3561debf5e082d2baa118a7d967aa7b43f5ab4b69122 Apr 22 18:37:46.208495 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.208480 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lpc4b" Apr 22 18:37:46.213686 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:37:46.213666 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod247d22a1_a2b2_4af9_b8c2_0bfad6ca0664.slice/crio-0a80c6568b42cf11e4e07a1df593f0c011f1670af724ce5a1fe0950fe62e4574 WatchSource:0}: Error finding container 0a80c6568b42cf11e4e07a1df593f0c011f1670af724ce5a1fe0950fe62e4574: Status 404 returned error can't find the container with id 0a80c6568b42cf11e4e07a1df593f0c011f1670af724ce5a1fe0950fe62e4574 Apr 22 18:37:46.406757 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.406666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " 
pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:37:46.406914 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:46.406809 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:46.406914 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:46.406867 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs podName:e3bffc40-492a-471a-83d2-c9bd203d82a8 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:47.406850185 +0000 UTC m=+3.045375301 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs") pod "network-metrics-daemon-cqf5t" (UID: "e3bffc40-492a-471a-83d2-c9bd203d82a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:37:46.507466 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.507395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:37:46.507637 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:46.507576 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:37:46.507637 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:46.507597 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:37:46.507637 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:46.507607 2577 
projected.go:194] Error preparing data for projected volume kube-api-access-gd9ct for pod openshift-network-diagnostics/network-check-target-jkhtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:46.507810 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:46.507659 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct podName:c50ef4af-f64b-4608-b9e7-126d66048d98 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:47.507641369 +0000 UTC m=+3.146166483 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gd9ct" (UniqueName: "kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct") pod "network-check-target-jkhtl" (UID: "c50ef4af-f64b-4608-b9e7-126d66048d98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:37:46.523126 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.523095 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:46.777121 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.776960 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:37:46.835579 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.835534 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:32:45 +0000 UTC" deadline="2027-09-25 22:31:38.389028882 +0000 UTC" Apr 22 18:37:46.835579 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.835572 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" 
sleep="12507h53m51.553460683s" Apr 22 18:37:46.920524 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.920438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fcffm" event={"ID":"b4af3b82-6cae-4792-8fe7-cf2daed473d1","Type":"ContainerStarted","Data":"0c300072d1a790ada98d3561debf5e082d2baa118a7d967aa7b43f5ab4b69122"} Apr 22 18:37:46.928758 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.928709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr4sw" event={"ID":"556c4304-c27e-49e6-9289-7b8986ec176b","Type":"ContainerStarted","Data":"42862c3e4b205142a300dda930ad40aa21dd024f4efb2938e2b3d08ee2a7159d"} Apr 22 18:37:46.937359 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.937326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-642nb" event={"ID":"f72f07da-c956-44a1-91e4-efb83a4ae9fc","Type":"ContainerStarted","Data":"2268860cc3aa89163ac8a77a32e257669f65ed655fd1c4ffd40acefdde58d6ba"} Apr 22 18:37:46.951546 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.951514 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jpzr8" event={"ID":"27440aa5-4698-4ea7-b7a8-ca0f7994d4e8","Type":"ContainerStarted","Data":"d58ffd8c7ac869225e713a7bb4c1167e04dfbe10ea1e670ffa45ea697a2c190f"} Apr 22 18:37:46.971978 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.971953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"33c3f323e3c7743f691df3742f59929afd5019b20eb72ef40a67d4ac596d7479"} Apr 22 18:37:46.982892 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.982845 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" 
event={"ID":"baab329e26e3fec548046283f03a6805","Type":"ContainerStarted","Data":"d6cf837a29f1bacaeeb55ebc6d1d97d7db10a8f5cabd9152c93fd7547466583e"} Apr 22 18:37:46.995946 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.995879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" event={"ID":"9a85548356ac6e6ad9bedf610076abee","Type":"ContainerStarted","Data":"df607a7c6ceed43cc161db9bd07f3ef0223bb951a8393b67e68614f1e92bc975"} Apr 22 18:37:46.999336 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:46.999314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lpc4b" event={"ID":"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664","Type":"ContainerStarted","Data":"0a80c6568b42cf11e4e07a1df593f0c011f1670af724ce5a1fe0950fe62e4574"} Apr 22 18:37:47.018079 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.018044 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" event={"ID":"174b9358-0c72-4c5e-94ed-aeee18c176a2","Type":"ContainerStarted","Data":"65e3047ad0b01cd29cc598d18f075aad586a4f77544f9fe713e855679ccc254f"} Apr 22 18:37:47.035737 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.035660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" event={"ID":"728367cd-e906-4445-a992-1b5b0aebdce9","Type":"ContainerStarted","Data":"0d57731a494a000db54015c2a86fdc8802ee5772bb26552df6777590fb528887"} Apr 22 18:37:47.052603 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.052529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-648xg" event={"ID":"4f6716c1-8454-4bb0-a15d-144eeaa62e20","Type":"ContainerStarted","Data":"f0f79309e2d80157d9b85b49dac0ba6cb32cb1e5f441c54bd28abe9b0ee5d3fd"} Apr 22 18:37:47.416253 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.416168 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:47.416454 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:47.416316 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:47.416454 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:47.416377 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs podName:e3bffc40-492a-471a-83d2-c9bd203d82a8 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:49.416357237 +0000 UTC m=+5.054882359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs") pod "network-metrics-daemon-cqf5t" (UID: "e3bffc40-492a-471a-83d2-c9bd203d82a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:47.517675 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.517003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:47.517675 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:47.517209 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:47.517675 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:47.517229 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:47.517675 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:47.517243 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gd9ct for pod openshift-network-diagnostics/network-check-target-jkhtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:47.517675 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:47.517297 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct podName:c50ef4af-f64b-4608-b9e7-126d66048d98 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:49.51727971 +0000 UTC m=+5.155804815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gd9ct" (UniqueName: "kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct") pod "network-check-target-jkhtl" (UID: "c50ef4af-f64b-4608-b9e7-126d66048d98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:47.836637 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.836528 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:32:45 +0000 UTC" deadline="2027-11-08 07:44:29.174580743 +0000 UTC"
Apr 22 18:37:47.836637 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.836564 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13549h6m41.338020053s"
Apr 22 18:37:47.880624 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.880136 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:47.880624 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:47.880278 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8"
Apr 22 18:37:47.880624 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:47.880420 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:47.880624 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:47.880539 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:37:48.801575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:48.801544 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:37:49.431593 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:49.431558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:49.432051 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:49.431775 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:49.432051 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:49.431879 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs podName:e3bffc40-492a-471a-83d2-c9bd203d82a8 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:53.431828519 +0000 UTC m=+9.070353625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs") pod "network-metrics-daemon-cqf5t" (UID: "e3bffc40-492a-471a-83d2-c9bd203d82a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:49.532707 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:49.532670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:49.532900 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:49.532869 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:49.532900 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:49.532897 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:49.533029 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:49.532910 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gd9ct for pod openshift-network-diagnostics/network-check-target-jkhtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:49.533029 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:49.532971 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct podName:c50ef4af-f64b-4608-b9e7-126d66048d98 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:53.532953918 +0000 UTC m=+9.171479018 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gd9ct" (UniqueName: "kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct") pod "network-check-target-jkhtl" (UID: "c50ef4af-f64b-4608-b9e7-126d66048d98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:49.879633 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:49.879560 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:49.879798 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:49.879682 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:37:49.879873 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:49.879852 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:49.879988 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:49.879968 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8"
Apr 22 18:37:51.696405 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.696308 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fgvjk"]
Apr 22 18:37:51.703611 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.703539 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.703737 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:51.703618 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510"
Apr 22 18:37:51.750993 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.750960 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d025095-cc31-4ade-becf-5c56f458a510-dbus\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.751158 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.751061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.751158 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.751132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d025095-cc31-4ade-becf-5c56f458a510-kubelet-config\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.851880 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.851841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.852071 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.851900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d025095-cc31-4ade-becf-5c56f458a510-kubelet-config\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.852071 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.851949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d025095-cc31-4ade-becf-5c56f458a510-dbus\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.852071 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:51.852029 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:51.852071 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.852047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d025095-cc31-4ade-becf-5c56f458a510-dbus\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.852281 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:51.852104 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret podName:7d025095-cc31-4ade-becf-5c56f458a510 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:52.352083942 +0000 UTC m=+7.990609060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret") pod "global-pull-secret-syncer-fgvjk" (UID: "7d025095-cc31-4ade-becf-5c56f458a510") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:51.852281 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.852116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d025095-cc31-4ade-becf-5c56f458a510-kubelet-config\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:51.879965 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.879929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:51.880121 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:51.880074 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8"
Apr 22 18:37:51.880589 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:51.880469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:51.880589 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:51.880558 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:37:52.356635 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:52.356052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:52.356635 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:52.356248 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:52.356635 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:52.356308 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret podName:7d025095-cc31-4ade-becf-5c56f458a510 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:53.356290082 +0000 UTC m=+8.994815185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret") pod "global-pull-secret-syncer-fgvjk" (UID: "7d025095-cc31-4ade-becf-5c56f458a510") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:53.364277 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:53.364236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:53.364779 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.364400 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:53.364779 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.364484 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret podName:7d025095-cc31-4ade-becf-5c56f458a510 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:55.364457508 +0000 UTC m=+11.002982623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret") pod "global-pull-secret-syncer-fgvjk" (UID: "7d025095-cc31-4ade-becf-5c56f458a510") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:53.464815 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:53.464781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:53.464971 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.464933 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:53.465042 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.465009 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs podName:e3bffc40-492a-471a-83d2-c9bd203d82a8 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:01.46498714 +0000 UTC m=+17.103512245 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs") pod "network-metrics-daemon-cqf5t" (UID: "e3bffc40-492a-471a-83d2-c9bd203d82a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:37:53.565729 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:53.565674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:53.565899 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.565840 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:37:53.565899 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.565860 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:37:53.565899 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.565873 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gd9ct for pod openshift-network-diagnostics/network-check-target-jkhtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:53.566044 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.565932 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct podName:c50ef4af-f64b-4608-b9e7-126d66048d98 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:01.565914888 +0000 UTC m=+17.204439994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gd9ct" (UniqueName: "kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct") pod "network-check-target-jkhtl" (UID: "c50ef4af-f64b-4608-b9e7-126d66048d98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:37:53.880106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:53.880020 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:53.880106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:53.880058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:53.880329 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.880162 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8"
Apr 22 18:37:53.880513 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.880471 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:37:53.880513 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:53.880020 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:53.880649 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:53.880596 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510"
Apr 22 18:37:55.379450 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:55.379415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:55.379902 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:55.379574 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:55.379902 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:55.379655 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret podName:7d025095-cc31-4ade-becf-5c56f458a510 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:59.379635281 +0000 UTC m=+15.018160382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret") pod "global-pull-secret-syncer-fgvjk" (UID: "7d025095-cc31-4ade-becf-5c56f458a510") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:55.880415 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:55.880385 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:55.880599 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:55.880386 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:55.880599 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:55.880576 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:37:55.880599 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:55.880480 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510"
Apr 22 18:37:55.880599 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:55.880387 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:55.880817 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:55.880678 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8"
Apr 22 18:37:57.879885 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:57.879850 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:57.879885 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:57.879874 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:57.880329 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:57.879966 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8"
Apr 22 18:37:57.880329 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:57.880006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:57.880329 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:57.880093 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510"
Apr 22 18:37:57.880329 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:57.880172 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:37:59.405987 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:59.405926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:59.406483 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:59.406106 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:59.406483 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:59.406193 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret podName:7d025095-cc31-4ade-becf-5c56f458a510 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:07.406170144 +0000 UTC m=+23.044695261 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret") pod "global-pull-secret-syncer-fgvjk" (UID: "7d025095-cc31-4ade-becf-5c56f458a510") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:37:59.879765 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:59.879671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:37:59.879765 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:59.879671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:37:59.879765 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:37:59.879671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:37:59.880004 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:59.879813 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510"
Apr 22 18:37:59.880004 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:59.879901 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8"
Apr 22 18:37:59.880004 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:37:59.879973 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:38:01.523373 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:01.523336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:38:01.523791 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.523531 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:38:01.523791 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.523633 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs podName:e3bffc40-492a-471a-83d2-c9bd203d82a8 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:17.523612926 +0000 UTC m=+33.162138042 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs") pod "network-metrics-daemon-cqf5t" (UID: "e3bffc40-492a-471a-83d2-c9bd203d82a8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:38:01.624482 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:01.624451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:38:01.624657 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.624623 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:38:01.624657 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.624651 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:38:01.624789 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.624666 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gd9ct for pod openshift-network-diagnostics/network-check-target-jkhtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:38:01.624789 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.624740 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct podName:c50ef4af-f64b-4608-b9e7-126d66048d98 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:17.624709249 +0000 UTC m=+33.263234348 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gd9ct" (UniqueName: "kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct") pod "network-check-target-jkhtl" (UID: "c50ef4af-f64b-4608-b9e7-126d66048d98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:38:01.880063 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:01.879982 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:38:01.880063 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:01.880026 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:38:01.880351 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:01.879982 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t"
Apr 22 18:38:01.880351 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.880088 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98"
Apr 22 18:38:01.880351 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.880169 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:01.880351 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:01.880226 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:03.880108 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:03.880084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:03.880108 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:03.880096 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:03.880469 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:03.880096 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:03.880469 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:03.880186 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:03.880469 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:03.880238 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98" Apr 22 18:38:03.880469 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:03.880315 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:04.079170 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:04.079002 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pr4sw" event={"ID":"556c4304-c27e-49e6-9289-7b8986ec176b","Type":"ContainerStarted","Data":"450ef4812ee90ad9050fe7c2b917fdf84f35ba65e4aa71b225706699c3995b0f"} Apr 22 18:38:04.080402 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:04.080376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" event={"ID":"baab329e26e3fec548046283f03a6805","Type":"ContainerStarted","Data":"0a10d23680ac4dd5dbbf173a0a70c6740e537d7e5ea39db70c923c6c9dc4be12"} Apr 22 18:38:04.081734 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:04.081698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" 
event={"ID":"174b9358-0c72-4c5e-94ed-aeee18c176a2","Type":"ContainerStarted","Data":"b39b4132fdaeef423735afe61be63fd6e1e5c3f167763c353878891ce8b181b4"} Apr 22 18:38:04.095945 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:04.095906 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pr4sw" podStartSLOduration=1.604600287 podStartE2EDuration="19.095896006s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.195374678 +0000 UTC m=+1.833899779" lastFinishedPulling="2026-04-22 18:38:03.686670382 +0000 UTC m=+19.325195498" observedRunningTime="2026-04-22 18:38:04.095583634 +0000 UTC m=+19.734108756" watchObservedRunningTime="2026-04-22 18:38:04.095896006 +0000 UTC m=+19.734421127" Apr 22 18:38:04.126175 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:04.126121 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8qzs8" podStartSLOduration=1.641520748 podStartE2EDuration="19.126106165s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.201246507 +0000 UTC m=+1.839771607" lastFinishedPulling="2026-04-22 18:38:03.685831923 +0000 UTC m=+19.324357024" observedRunningTime="2026-04-22 18:38:04.111769702 +0000 UTC m=+19.750294826" watchObservedRunningTime="2026-04-22 18:38:04.126106165 +0000 UTC m=+19.764631287" Apr 22 18:38:04.126433 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:04.126411 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-10.ec2.internal" podStartSLOduration=19.126405582 podStartE2EDuration="19.126405582s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:38:04.125828619 +0000 UTC m=+19.764353745" watchObservedRunningTime="2026-04-22 
18:38:04.126405582 +0000 UTC m=+19.764930703" Apr 22 18:38:05.084965 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.084803 2577 generic.go:358] "Generic (PLEG): container finished" podID="9a85548356ac6e6ad9bedf610076abee" containerID="f7297b4aec568e11be53f23aade227a698e1545b70e4f519d3d3473246b0fac0" exitCode=0 Apr 22 18:38:05.085670 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.084878 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" event={"ID":"9a85548356ac6e6ad9bedf610076abee","Type":"ContainerDied","Data":"f7297b4aec568e11be53f23aade227a698e1545b70e4f519d3d3473246b0fac0"} Apr 22 18:38:05.086238 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.086221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lpc4b" event={"ID":"247d22a1-a2b2-4af9-b8c2-0bfad6ca0664","Type":"ContainerStarted","Data":"5e524dc59aa6a7c82f734b857c516a8ff4ffc80497b46c427dd74167b601ab65"} Apr 22 18:38:05.087416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.087398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" event={"ID":"728367cd-e906-4445-a992-1b5b0aebdce9","Type":"ContainerStarted","Data":"d41198a1de32cdfc9e7c4823f095ac8f70bec2859466ba069d818ceb8c0ce54e"} Apr 22 18:38:05.088549 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.088527 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-648xg" event={"ID":"4f6716c1-8454-4bb0-a15d-144eeaa62e20","Type":"ContainerStarted","Data":"92d9a646ce7812e603d4b06b6d2c641d4d1b03aa70aaa53b8db6da3695143d38"} Apr 22 18:38:05.089830 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.089811 2577 generic.go:358] "Generic (PLEG): container finished" podID="b4af3b82-6cae-4792-8fe7-cf2daed473d1" containerID="119d926e5344f6c3d0a862a2691bbca6ce49ee5983e85cbfde85822e962b248c" exitCode=0 Apr 22 
18:38:05.089920 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.089869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fcffm" event={"ID":"b4af3b82-6cae-4792-8fe7-cf2daed473d1","Type":"ContainerDied","Data":"119d926e5344f6c3d0a862a2691bbca6ce49ee5983e85cbfde85822e962b248c"} Apr 22 18:38:05.091199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.091109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-642nb" event={"ID":"f72f07da-c956-44a1-91e4-efb83a4ae9fc","Type":"ContainerStarted","Data":"661b73370b7bd719544089ceea0d093e3967ba3752ca4dfd005cf972f1c6f277"} Apr 22 18:38:05.092224 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.092206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jpzr8" event={"ID":"27440aa5-4698-4ea7-b7a8-ca0f7994d4e8","Type":"ContainerStarted","Data":"24ac1a789ced3b3e632d2403443c14e87b3034b6dc415a1d6428cd3f66a39a21"} Apr 22 18:38:05.094400 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.094380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"e646cf18aa14ac64d57a65798bbd213a3f6d190cdd1ca6fe95ce1e43106a7d43"} Apr 22 18:38:05.094479 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.094406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"ef0601e7381b7a5090101bc8fa8b84a46479c45238dc6d68af4255d3d1ff07f3"} Apr 22 18:38:05.094479 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.094420 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" 
event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"863d5ee6648f331b647e442914eb55caf859add2cfceaed0ee22f8eef1d5231a"} Apr 22 18:38:05.094479 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.094432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"1dae5e000c833ad1a66e9b4a62d34847d518448e6f9e9bb9be9d9b69e6eabea7"} Apr 22 18:38:05.094479 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.094444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"ca7295aaededa924c3dc8a6ef3fb21d6c928395d377c6483483f20ed609dab2c"} Apr 22 18:38:05.094479 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.094455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"b5c630c724ab93f44662cccf3cd03ceea5e854faa4d96ab3e6d4889a53b891f8"} Apr 22 18:38:05.102132 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.102099 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lpc4b" podStartSLOduration=2.666665116 podStartE2EDuration="20.102088646s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.214992274 +0000 UTC m=+1.853517373" lastFinishedPulling="2026-04-22 18:38:03.650415797 +0000 UTC m=+19.288940903" observedRunningTime="2026-04-22 18:38:05.10130921 +0000 UTC m=+20.739834332" watchObservedRunningTime="2026-04-22 18:38:05.102088646 +0000 UTC m=+20.740613768" Apr 22 18:38:05.132537 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.132498 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-642nb" podStartSLOduration=2.662941801 podStartE2EDuration="20.132489012s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.176529245 +0000 UTC m=+1.815054348" lastFinishedPulling="2026-04-22 18:38:03.646076444 +0000 UTC m=+19.284601559" observedRunningTime="2026-04-22 18:38:05.132261267 +0000 UTC m=+20.770786389" watchObservedRunningTime="2026-04-22 18:38:05.132489012 +0000 UTC m=+20.771014133" Apr 22 18:38:05.132617 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.132559 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-648xg" podStartSLOduration=3.6144041700000002 podStartE2EDuration="21.132556478s" podCreationTimestamp="2026-04-22 18:37:44 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.127531453 +0000 UTC m=+1.766056553" lastFinishedPulling="2026-04-22 18:38:03.64568376 +0000 UTC m=+19.284208861" observedRunningTime="2026-04-22 18:38:05.11870749 +0000 UTC m=+20.757232611" watchObservedRunningTime="2026-04-22 18:38:05.132556478 +0000 UTC m=+20.771081602" Apr 22 18:38:05.168564 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.168526 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jpzr8" podStartSLOduration=3.664702743 podStartE2EDuration="21.16851568s" podCreationTimestamp="2026-04-22 18:37:44 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.146555085 +0000 UTC m=+1.785080186" lastFinishedPulling="2026-04-22 18:38:03.650368008 +0000 UTC m=+19.288893123" observedRunningTime="2026-04-22 18:38:05.168322665 +0000 UTC m=+20.806847786" watchObservedRunningTime="2026-04-22 18:38:05.16851568 +0000 UTC m=+20.807040801" Apr 22 18:38:05.841033 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.840856 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 
18:38:05.861740 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.861642 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:38:05.841029908Z","UUID":"9b208b42-6a38-457f-9fdd-319297f1cd66","Handler":null,"Name":"","Endpoint":""} Apr 22 18:38:05.864930 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.864903 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:38:05.864930 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.864933 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:38:05.880347 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.880326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:05.880444 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.880326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:05.880481 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:05.880457 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:05.880526 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:05.880505 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98" Apr 22 18:38:05.880564 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:05.880326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:05.880599 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:05.880591 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:06.098350 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:06.098256 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" event={"ID":"728367cd-e906-4445-a992-1b5b0aebdce9","Type":"ContainerStarted","Data":"fd5373634531392ff8c3d48a58c487e62726a193ffff9463d6e1836db3f0e28a"} Apr 22 18:38:07.103297 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.103251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"f9c8daba5a3ddecd746aa2f14e1a259f070923d3aa01ce2f84dec8735c802b8f"} Apr 22 18:38:07.104940 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.104918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" event={"ID":"9a85548356ac6e6ad9bedf610076abee","Type":"ContainerStarted","Data":"f1c9886d5762748410937b5f602893d6bdd0189f2c24021da3a1d83fcc5a286f"} Apr 22 18:38:07.106896 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.106864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" event={"ID":"728367cd-e906-4445-a992-1b5b0aebdce9","Type":"ContainerStarted","Data":"8ee7eb12eb0c81d8f3c20183621249502c42cc89d068fc4d6a9f3f5c6f8c7bed"} Apr 22 18:38:07.122490 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.120231 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-10.ec2.internal" podStartSLOduration=22.120215512 podStartE2EDuration="22.120215512s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 18:38:07.119853245 +0000 UTC m=+22.758378368" watchObservedRunningTime="2026-04-22 18:38:07.120215512 +0000 UTC m=+22.758740665" Apr 22 18:38:07.136226 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.136171 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8b6j" podStartSLOduration=2.366843536 podStartE2EDuration="23.136159449s" podCreationTimestamp="2026-04-22 18:37:44 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.150624816 +0000 UTC m=+1.789149916" lastFinishedPulling="2026-04-22 18:38:06.919940725 +0000 UTC m=+22.558465829" observedRunningTime="2026-04-22 18:38:07.135891416 +0000 UTC m=+22.774416543" watchObservedRunningTime="2026-04-22 18:38:07.136159449 +0000 UTC m=+22.774684571" Apr 22 18:38:07.473076 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.473042 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:07.473229 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:07.473175 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:38:07.473270 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:07.473230 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret podName:7d025095-cc31-4ade-becf-5c56f458a510 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:23.473215467 +0000 UTC m=+39.111740567 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret") pod "global-pull-secret-syncer-fgvjk" (UID: "7d025095-cc31-4ade-becf-5c56f458a510") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:38:07.880476 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.880411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:07.880476 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.880466 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:07.880655 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:07.880467 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:07.880655 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:07.880540 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98" Apr 22 18:38:07.880749 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:07.880653 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:07.880749 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:07.880704 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:08.301307 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:08.301274 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:38:08.302018 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:08.301979 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:38:09.880112 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:09.880077 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:09.880112 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:09.880098 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:09.880623 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:09.880077 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:09.880623 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:09.880203 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98" Apr 22 18:38:09.880623 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:09.880276 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:09.880623 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:09.880359 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:11.115211 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.115036 2577 generic.go:358] "Generic (PLEG): container finished" podID="b4af3b82-6cae-4792-8fe7-cf2daed473d1" containerID="56df477a480f8c56c7e0c8071538d3a2537770dc530170da819512ea69fd5595" exitCode=0 Apr 22 18:38:11.115693 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.115127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fcffm" event={"ID":"b4af3b82-6cae-4792-8fe7-cf2daed473d1","Type":"ContainerDied","Data":"56df477a480f8c56c7e0c8071538d3a2537770dc530170da819512ea69fd5595"} Apr 22 18:38:11.118392 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.118270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" 
event={"ID":"bfc4f516-12a3-4b4c-948f-d21348678585","Type":"ContainerStarted","Data":"811903b12da3c6cef6a24337ca849406f379adec2661c1f680b0021e2e71f7e1"} Apr 22 18:38:11.118616 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.118597 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:38:11.118710 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.118624 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:38:11.118710 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.118637 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:38:11.132809 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.132787 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:38:11.132902 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.132873 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" Apr 22 18:38:11.880068 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.879903 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:11.880215 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.879903 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:11.880215 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:11.880158 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98" Apr 22 18:38:11.880313 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:11.880214 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:11.880313 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.879989 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:11.880313 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:11.880306 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:11.956825 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.956765 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:38:11.956954 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.956907 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:38:11.957288 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.957272 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jpzr8" Apr 22 18:38:11.971852 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:11.971815 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp" podStartSLOduration=10.04628779 podStartE2EDuration="27.971802929s" podCreationTimestamp="2026-04-22 18:37:44 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.119035916 +0000 UTC m=+1.757561016" lastFinishedPulling="2026-04-22 18:38:04.044551042 +0000 UTC m=+19.683076155" observedRunningTime="2026-04-22 18:38:11.174243937 +0000 UTC m=+26.812769082" watchObservedRunningTime="2026-04-22 18:38:11.971802929 +0000 UTC m=+27.610328047" Apr 22 18:38:12.126092 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:12.126061 2577 generic.go:358] "Generic (PLEG): container finished" podID="b4af3b82-6cae-4792-8fe7-cf2daed473d1" containerID="cc3916cd97751037fc13119cc30c8fc3abb7d94d8ab7c89de9fb7d802d5e397d" exitCode=0 Apr 22 18:38:12.126585 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:12.126146 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fcffm" event={"ID":"b4af3b82-6cae-4792-8fe7-cf2daed473d1","Type":"ContainerDied","Data":"cc3916cd97751037fc13119cc30c8fc3abb7d94d8ab7c89de9fb7d802d5e397d"} Apr 22 18:38:12.126642 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:12.126588 
2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cqf5t"] Apr 22 18:38:12.127038 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:12.127023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:12.127152 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:12.127127 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:12.129115 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:12.129089 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jkhtl"] Apr 22 18:38:12.129180 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:12.129146 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:12.129220 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:12.129210 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98" Apr 22 18:38:12.142084 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:12.142063 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fgvjk"] Apr 22 18:38:12.142159 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:12.142136 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:12.142212 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:12.142196 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:13.130347 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:13.130319 2577 generic.go:358] "Generic (PLEG): container finished" podID="b4af3b82-6cae-4792-8fe7-cf2daed473d1" containerID="0007c0b99bb7da2f26616911455d467ec440baa79c6afcfe72a3e5158ea25e87" exitCode=0 Apr 22 18:38:13.130899 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:13.130395 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fcffm" event={"ID":"b4af3b82-6cae-4792-8fe7-cf2daed473d1","Type":"ContainerDied","Data":"0007c0b99bb7da2f26616911455d467ec440baa79c6afcfe72a3e5158ea25e87"} Apr 22 18:38:13.880499 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:13.880459 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:13.880499 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:13.880486 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:13.880778 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:13.880486 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:13.880778 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:13.880577 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:13.880778 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:13.880655 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:13.880778 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:13.880725 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98" Apr 22 18:38:15.880519 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:15.880485 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:15.881080 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:15.880523 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:15.881080 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:15.880485 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:15.881080 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:15.880612 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqf5t" podUID="e3bffc40-492a-471a-83d2-c9bd203d82a8" Apr 22 18:38:15.881080 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:15.880748 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fgvjk" podUID="7d025095-cc31-4ade-becf-5c56f458a510" Apr 22 18:38:15.881080 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:15.880835 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jkhtl" podUID="c50ef4af-f64b-4608-b9e7-126d66048d98" Apr 22 18:38:16.675148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.675115 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-10.ec2.internal" event="NodeReady" Apr 22 18:38:16.675312 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.675262 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:38:16.728799 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.728765 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-khmjv"] Apr 22 18:38:16.760557 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.760529 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-psh89"] Apr 22 18:38:16.760695 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.760682 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.762921 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.762886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:38:16.762921 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.762907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:38:16.763176 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.763160 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cfw2h\"" Apr 22 18:38:16.780010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.779822 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-khmjv"] Apr 22 18:38:16.780116 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.780026 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-psh89"] 
Apr 22 18:38:16.780116 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.779960 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-psh89" Apr 22 18:38:16.782253 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.782229 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:38:16.782372 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.782337 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:38:16.782372 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.782348 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:38:16.782372 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.782363 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rz76n\"" Apr 22 18:38:16.846730 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.846686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bae8419-18dc-4dd7-a71d-bacb197e4c26-config-volume\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.846882 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.846785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.846882 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.846815 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bae8419-18dc-4dd7-a71d-bacb197e4c26-tmp-dir\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.846882 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.846841 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxb77\" (UniqueName: \"kubernetes.io/projected/7bae8419-18dc-4dd7-a71d-bacb197e4c26-kube-api-access-nxb77\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.948074 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.947988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxm2h\" (UniqueName: \"kubernetes.io/projected/41ba337b-6571-4648-95a1-73c5f1faa37f-kube-api-access-xxm2h\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89" Apr 22 18:38:16.948074 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.948059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.948574 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.948099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bae8419-18dc-4dd7-a71d-bacb197e4c26-tmp-dir\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.948574 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.948128 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxb77\" (UniqueName: \"kubernetes.io/projected/7bae8419-18dc-4dd7-a71d-bacb197e4c26-kube-api-access-nxb77\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.948574 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.948178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89" Apr 22 18:38:16.948574 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:16.948181 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:38:16.948574 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.948225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bae8419-18dc-4dd7-a71d-bacb197e4c26-config-volume\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.948574 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:16.948254 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls podName:7bae8419-18dc-4dd7-a71d-bacb197e4c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:17.448233483 +0000 UTC m=+33.086758583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls") pod "dns-default-khmjv" (UID: "7bae8419-18dc-4dd7-a71d-bacb197e4c26") : secret "dns-default-metrics-tls" not found Apr 22 18:38:16.948574 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.948490 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bae8419-18dc-4dd7-a71d-bacb197e4c26-tmp-dir\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.948882 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.948800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bae8419-18dc-4dd7-a71d-bacb197e4c26-config-volume\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:16.959182 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:16.959162 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxb77\" (UniqueName: \"kubernetes.io/projected/7bae8419-18dc-4dd7-a71d-bacb197e4c26-kube-api-access-nxb77\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:17.049343 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.049300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89" Apr 22 18:38:17.049530 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.049473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxm2h\" (UniqueName: 
\"kubernetes.io/projected/41ba337b-6571-4648-95a1-73c5f1faa37f-kube-api-access-xxm2h\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89" Apr 22 18:38:17.049598 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.049476 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:38:17.049678 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.049655 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert podName:41ba337b-6571-4648-95a1-73c5f1faa37f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:17.549635936 +0000 UTC m=+33.188161044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert") pod "ingress-canary-psh89" (UID: "41ba337b-6571-4648-95a1-73c5f1faa37f") : secret "canary-serving-cert" not found Apr 22 18:38:17.064284 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.064261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxm2h\" (UniqueName: \"kubernetes.io/projected/41ba337b-6571-4648-95a1-73c5f1faa37f-kube-api-access-xxm2h\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89" Apr 22 18:38:17.452217 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.452184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:17.452484 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.452351 2577 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:38:17.452484 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.452423 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls podName:7bae8419-18dc-4dd7-a71d-bacb197e4c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:18.452404994 +0000 UTC m=+34.090930098 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls") pod "dns-default-khmjv" (UID: "7bae8419-18dc-4dd7-a71d-bacb197e4c26") : secret "dns-default-metrics-tls" not found Apr 22 18:38:17.552895 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.552857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:17.553065 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.552961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89" Apr 22 18:38:17.553065 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.553045 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:38:17.553176 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.553098 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert podName:41ba337b-6571-4648-95a1-73c5f1faa37f nodeName:}" failed. 
No retries permitted until 2026-04-22 18:38:18.553076802 +0000 UTC m=+34.191601902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert") pod "ingress-canary-psh89" (UID: "41ba337b-6571-4648-95a1-73c5f1faa37f") : secret "canary-serving-cert" not found Apr 22 18:38:17.553176 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.553041 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:38:17.553249 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.553189 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs podName:e3bffc40-492a-471a-83d2-c9bd203d82a8 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:49.553170854 +0000 UTC m=+65.191695968 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs") pod "network-metrics-daemon-cqf5t" (UID: "e3bffc40-492a-471a-83d2-c9bd203d82a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:38:17.653955 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.653920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:17.654137 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.654099 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:38:17.654137 ip-10-0-139-10 
kubenswrapper[2577]: E0422 18:38:17.654123 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:38:17.654137 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.654138 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gd9ct for pod openshift-network-diagnostics/network-check-target-jkhtl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:38:17.654288 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:17.654203 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct podName:c50ef4af-f64b-4608-b9e7-126d66048d98 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:49.654184273 +0000 UTC m=+65.292709386 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-gd9ct" (UniqueName: "kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct") pod "network-check-target-jkhtl" (UID: "c50ef4af-f64b-4608-b9e7-126d66048d98") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:38:17.880040 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.879953 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk" Apr 22 18:38:17.880309 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.879953 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:17.880309 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.879953 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:17.882833 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.882812 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:38:17.883777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.883755 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-stzzq\"" Apr 22 18:38:17.883777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.883767 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:38:17.883939 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.883790 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:38:17.883939 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.883762 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:38:17.883939 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:17.883755 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-djgkc\"" Apr 22 18:38:18.460087 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:18.460046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv" Apr 22 18:38:18.460590 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:18.460211 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:38:18.460590 ip-10-0-139-10 
kubenswrapper[2577]: E0422 18:38:18.460274 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls podName:7bae8419-18dc-4dd7-a71d-bacb197e4c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:20.46025778 +0000 UTC m=+36.098782880 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls") pod "dns-default-khmjv" (UID: "7bae8419-18dc-4dd7-a71d-bacb197e4c26") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:18.560968 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:18.560933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89"
Apr 22 18:38:18.561116 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:18.561050 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:18.561155 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:18.561144 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert podName:41ba337b-6571-4648-95a1-73c5f1faa37f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:20.561130333 +0000 UTC m=+36.199655433 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert") pod "ingress-canary-psh89" (UID: "41ba337b-6571-4648-95a1-73c5f1faa37f") : secret "canary-serving-cert" not found
Apr 22 18:38:20.145881 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:20.145847 2577 generic.go:358] "Generic (PLEG): container finished" podID="b4af3b82-6cae-4792-8fe7-cf2daed473d1" containerID="4083f4ae70e894be0ec66a6c05f96d96065ac6be5a7621c5d0c4298c3b0ac525" exitCode=0
Apr 22 18:38:20.146306 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:20.145913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fcffm" event={"ID":"b4af3b82-6cae-4792-8fe7-cf2daed473d1","Type":"ContainerDied","Data":"4083f4ae70e894be0ec66a6c05f96d96065ac6be5a7621c5d0c4298c3b0ac525"}
Apr 22 18:38:20.476353 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:20.476321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv"
Apr 22 18:38:20.476508 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:20.476441 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:20.476548 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:20.476515 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls podName:7bae8419-18dc-4dd7-a71d-bacb197e4c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:24.476481154 +0000 UTC m=+40.115006255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls") pod "dns-default-khmjv" (UID: "7bae8419-18dc-4dd7-a71d-bacb197e4c26") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:20.577547 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:20.577514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89"
Apr 22 18:38:20.577689 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:20.577624 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:20.577689 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:20.577670 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert podName:41ba337b-6571-4648-95a1-73c5f1faa37f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:24.57765595 +0000 UTC m=+40.216181050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert") pod "ingress-canary-psh89" (UID: "41ba337b-6571-4648-95a1-73c5f1faa37f") : secret "canary-serving-cert" not found
Apr 22 18:38:21.150152 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:21.150119 2577 generic.go:358] "Generic (PLEG): container finished" podID="b4af3b82-6cae-4792-8fe7-cf2daed473d1" containerID="bc9cdc70a12b166f25c6e72e6da7af2a214c9dc39afec2c1d22d178e01e61347" exitCode=0
Apr 22 18:38:21.150504 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:21.150164 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fcffm" event={"ID":"b4af3b82-6cae-4792-8fe7-cf2daed473d1","Type":"ContainerDied","Data":"bc9cdc70a12b166f25c6e72e6da7af2a214c9dc39afec2c1d22d178e01e61347"}
Apr 22 18:38:22.154849 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:22.154822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fcffm" event={"ID":"b4af3b82-6cae-4792-8fe7-cf2daed473d1","Type":"ContainerStarted","Data":"5f3ae848422655cc2e13785cdf1140a6d4b2cd65d7e506acdb5e611067d6eb70"}
Apr 22 18:38:22.179641 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:22.179588 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fcffm" podStartSLOduration=4.2867647 podStartE2EDuration="37.179571477s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="2026-04-22 18:37:46.207202057 +0000 UTC m=+1.845727157" lastFinishedPulling="2026-04-22 18:38:19.100008818 +0000 UTC m=+34.738533934" observedRunningTime="2026-04-22 18:38:22.179201464 +0000 UTC m=+37.817726598" watchObservedRunningTime="2026-04-22 18:38:22.179571477 +0000 UTC m=+37.818096600"
Apr 22 18:38:23.497791 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:23.497755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:38:23.500879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:23.500858 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d025095-cc31-4ade-becf-5c56f458a510-original-pull-secret\") pod \"global-pull-secret-syncer-fgvjk\" (UID: \"7d025095-cc31-4ade-becf-5c56f458a510\") " pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:38:23.590946 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:23.590908 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fgvjk"
Apr 22 18:38:23.749677 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:23.749614 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fgvjk"]
Apr 22 18:38:23.753022 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:38:23.752991 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d025095_cc31_4ade_becf_5c56f458a510.slice/crio-e47e96969487311b8a621b752d58ed54ca3b765441654e55fcf860c92060437d WatchSource:0}: Error finding container e47e96969487311b8a621b752d58ed54ca3b765441654e55fcf860c92060437d: Status 404 returned error can't find the container with id e47e96969487311b8a621b752d58ed54ca3b765441654e55fcf860c92060437d
Apr 22 18:38:24.159044 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:24.159015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fgvjk" event={"ID":"7d025095-cc31-4ade-becf-5c56f458a510","Type":"ContainerStarted","Data":"e47e96969487311b8a621b752d58ed54ca3b765441654e55fcf860c92060437d"}
Apr 22 18:38:24.505775 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:24.505675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv"
Apr 22 18:38:24.506140 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:24.505867 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:24.506140 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:24.505930 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls podName:7bae8419-18dc-4dd7-a71d-bacb197e4c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:32.505913688 +0000 UTC m=+48.144438787 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls") pod "dns-default-khmjv" (UID: "7bae8419-18dc-4dd7-a71d-bacb197e4c26") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:24.607088 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:24.607054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89"
Apr 22 18:38:24.607228 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:24.607184 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:24.607281 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:24.607244 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert podName:41ba337b-6571-4648-95a1-73c5f1faa37f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:32.607225153 +0000 UTC m=+48.245750256 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert") pod "ingress-canary-psh89" (UID: "41ba337b-6571-4648-95a1-73c5f1faa37f") : secret "canary-serving-cert" not found
Apr 22 18:38:29.169299 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:29.169260 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fgvjk" event={"ID":"7d025095-cc31-4ade-becf-5c56f458a510","Type":"ContainerStarted","Data":"228fb832ce45b59e3c8005fdffc3aecbc6389fdbb37a7a584770a49e5178bd94"}
Apr 22 18:38:29.185906 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:29.185846 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fgvjk" podStartSLOduration=33.828377462 podStartE2EDuration="38.185829768s" podCreationTimestamp="2026-04-22 18:37:51 +0000 UTC" firstStartedPulling="2026-04-22 18:38:23.754936568 +0000 UTC m=+39.393461676" lastFinishedPulling="2026-04-22 18:38:28.11238888 +0000 UTC m=+43.750913982" observedRunningTime="2026-04-22 18:38:29.184862306 +0000 UTC m=+44.823387429" watchObservedRunningTime="2026-04-22 18:38:29.185829768 +0000 UTC m=+44.824354890"
Apr 22 18:38:32.561749 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:32.561692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv"
Apr 22 18:38:32.562216 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:32.561845 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:32.562216 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:32.561943 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls podName:7bae8419-18dc-4dd7-a71d-bacb197e4c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:48.561921394 +0000 UTC m=+64.200446493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls") pod "dns-default-khmjv" (UID: "7bae8419-18dc-4dd7-a71d-bacb197e4c26") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:32.662660 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:32.662629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89"
Apr 22 18:38:32.662805 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:32.662762 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:38:32.662847 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:32.662817 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert podName:41ba337b-6571-4648-95a1-73c5f1faa37f nodeName:}" failed. No retries permitted until 2026-04-22 18:38:48.662803202 +0000 UTC m=+64.301328302 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert") pod "ingress-canary-psh89" (UID: "41ba337b-6571-4648-95a1-73c5f1faa37f") : secret "canary-serving-cert" not found
Apr 22 18:38:43.140494 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:43.140465 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bm5qp"
Apr 22 18:38:48.334784 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.334748 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-765df457d-pff85"]
Apr 22 18:38:48.370567 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.370538 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gsnxt"]
Apr 22 18:38:48.370737 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.370700 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.372961 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.372935 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 18:38:48.373066 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.372941 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 18:38:48.373230 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.373208 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-8sxz9\""
Apr 22 18:38:48.373333 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.373305 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 18:38:48.373383 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.373336 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 18:38:48.373427 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.373385 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 18:38:48.373463 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.373442 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 18:38:48.397758 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.397733 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gsnxt"]
Apr 22 18:38:48.397758 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.397757 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-765df457d-pff85"]
Apr 22 18:38:48.397879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.397851 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.400091 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.400077 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:38:48.400172 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.400075 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-b7v2h\""
Apr 22 18:38:48.400386 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.400364 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 18:38:48.400488 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.400472 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:38:48.400533 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.400495 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 18:38:48.418065 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.418041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 18:38:48.421900 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.421881 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"]
Apr 22 18:38:48.442835 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.442812 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"]
Apr 22 18:38:48.442923 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.442900 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.444957 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.444936 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 18:38:48.445054 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.444972 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 18:38:48.445111 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.445067 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:38:48.445372 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.445347 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zfppc\""
Apr 22 18:38:48.445467 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.445424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 18:38:48.470339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-tmp\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.470449 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.470449 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470372 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-default-certificate\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.470449 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-snapshots\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.470610 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbjhv\" (UniqueName: \"kubernetes.io/projected/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-kube-api-access-mbjhv\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.470610 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470514 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-stats-auth\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.470610 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470567 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.470610 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8vb6\" (UniqueName: \"kubernetes.io/projected/4db562fa-e132-42e2-8767-a511fe5551aa-kube-api-access-k8vb6\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.470791 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-serving-cert\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.470791 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.470791 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-service-ca-bundle\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.470791 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-config\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.470791 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktq6\" (UniqueName: \"kubernetes.io/projected/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-kube-api-access-mktq6\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.470791 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.470773 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.571271 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.571369 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-default-certificate\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.571369 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-snapshots\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.571369 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv"
Apr 22 18:38:48.571510 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbjhv\" (UniqueName: \"kubernetes.io/projected/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-kube-api-access-mbjhv\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.571510 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:48.571389 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:38:48.571510 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-stats-auth\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.571510 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:48.571445 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:49.071426154 +0000 UTC m=+64.709951260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : secret "router-metrics-certs-default" not found
Apr 22 18:38:48.571510 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.571510 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:48.571494 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:38:48.571510 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8vb6\" (UniqueName: \"kubernetes.io/projected/4db562fa-e132-42e2-8767-a511fe5551aa-kube-api-access-k8vb6\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.571854 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-serving-cert\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.571854 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:48.571557 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls podName:7bae8419-18dc-4dd7-a71d-bacb197e4c26 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:20.571539524 +0000 UTC m=+96.210064626 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls") pod "dns-default-khmjv" (UID: "7bae8419-18dc-4dd7-a71d-bacb197e4c26") : secret "dns-default-metrics-tls" not found
Apr 22 18:38:48.571854 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.571854 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-service-ca-bundle\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.571854 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-config\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.571854 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mktq6\" (UniqueName: \"kubernetes.io/projected/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-kube-api-access-mktq6\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.571854 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.571854 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.571770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-tmp\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.572213 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.572042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-tmp\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.572213 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.572081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-snapshots\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.572405 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:48.572390 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:49.072378847 +0000 UTC m=+64.710903947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : configmap references non-existent config key: service-ca.crt
Apr 22 18:38:48.572633 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.572603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-config\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.572922 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.572904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-service-ca-bundle\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.573036 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.573012 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.574078 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.574059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-default-certificate\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.574166 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.574144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-serving-cert\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.574337 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.574318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-stats-auth\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.574421 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.574405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.580709 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.580685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8vb6\" (UniqueName: \"kubernetes.io/projected/4db562fa-e132-42e2-8767-a511fe5551aa-kube-api-access-k8vb6\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:38:48.581062 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.581042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbjhv\" (UniqueName: \"kubernetes.io/projected/c98406ec-5c79-4a8a-b6e6-80a29ff6a160-kube-api-access-mbjhv\") pod \"service-ca-operator-d6fc45fc5-2g98f\" (UID: \"c98406ec-5c79-4a8a-b6e6-80a29ff6a160\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"
Apr 22 18:38:48.593557 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.593511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktq6\" (UniqueName: \"kubernetes.io/projected/ffeaef2c-e524-4aff-b6b1-3a7e61159f09-kube-api-access-mktq6\") pod \"insights-operator-585dfdc468-gsnxt\" (UID: \"ffeaef2c-e524-4aff-b6b1-3a7e61159f09\") " pod="openshift-insights/insights-operator-585dfdc468-gsnxt"
Apr 22 18:38:48.672934 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.672915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89"
Apr 22 18:38:48.673063 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:48.673045 2577 secret.go:189] Couldn't get secret
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:38:48.673110 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:48.673096 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert podName:41ba337b-6571-4648-95a1-73c5f1faa37f nodeName:}" failed. No retries permitted until 2026-04-22 18:39:20.67308356 +0000 UTC m=+96.311608664 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert") pod "ingress-canary-psh89" (UID: "41ba337b-6571-4648-95a1-73c5f1faa37f") : secret "canary-serving-cert" not found Apr 22 18:38:48.706439 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.706419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gsnxt" Apr 22 18:38:48.759078 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.756415 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f" Apr 22 18:38:48.847622 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.847566 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gsnxt"] Apr 22 18:38:48.850647 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:38:48.850621 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffeaef2c_e524_4aff_b6b1_3a7e61159f09.slice/crio-152f52d8772695af4ba466fc594fb6226f9d28174f0306efaad55142dc1d9912 WatchSource:0}: Error finding container 152f52d8772695af4ba466fc594fb6226f9d28174f0306efaad55142dc1d9912: Status 404 returned error can't find the container with id 152f52d8772695af4ba466fc594fb6226f9d28174f0306efaad55142dc1d9912 Apr 22 18:38:48.882360 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:48.882332 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f"] Apr 22 18:38:48.885201 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:38:48.885178 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc98406ec_5c79_4a8a_b6e6_80a29ff6a160.slice/crio-0f3daaad3e325dd33dc0742bbee5604a1f752b2bc877cfa34f021958f4a02edc WatchSource:0}: Error finding container 0f3daaad3e325dd33dc0742bbee5604a1f752b2bc877cfa34f021958f4a02edc: Status 404 returned error can't find the container with id 0f3daaad3e325dd33dc0742bbee5604a1f752b2bc877cfa34f021958f4a02edc Apr 22 18:38:49.080514 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.080486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " 
pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:38:49.080650 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.080551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:38:49.080650 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:49.080639 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:49.080739 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:49.080648 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:50.080631099 +0000 UTC m=+65.719156206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:49.080739 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:49.080704 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:50.080688219 +0000 UTC m=+65.719213322 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : secret "router-metrics-certs-default" not found Apr 22 18:38:49.206949 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.206917 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f" event={"ID":"c98406ec-5c79-4a8a-b6e6-80a29ff6a160","Type":"ContainerStarted","Data":"0f3daaad3e325dd33dc0742bbee5604a1f752b2bc877cfa34f021958f4a02edc"} Apr 22 18:38:49.207817 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.207798 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gsnxt" event={"ID":"ffeaef2c-e524-4aff-b6b1-3a7e61159f09","Type":"ContainerStarted","Data":"152f52d8772695af4ba466fc594fb6226f9d28174f0306efaad55142dc1d9912"} Apr 22 18:38:49.585474 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.585397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:38:49.588390 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.588364 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:38:49.596631 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:49.596610 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:38:49.596745 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:49.596668 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs podName:e3bffc40-492a-471a-83d2-c9bd203d82a8 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:53.596653994 +0000 UTC m=+129.235179098 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs") pod "network-metrics-daemon-cqf5t" (UID: "e3bffc40-492a-471a-83d2-c9bd203d82a8") : secret "metrics-daemon-secret" not found Apr 22 18:38:49.686510 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.686477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:49.689365 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.689345 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:38:49.699492 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.699460 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:38:49.710530 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:49.710509 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd9ct\" (UniqueName: \"kubernetes.io/projected/c50ef4af-f64b-4608-b9e7-126d66048d98-kube-api-access-gd9ct\") pod \"network-check-target-jkhtl\" (UID: \"c50ef4af-f64b-4608-b9e7-126d66048d98\") " pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:50.000031 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:50.000001 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-djgkc\"" Apr 22 18:38:50.008062 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:50.008028 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:50.090835 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:50.090808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:38:50.090994 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:50.090901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:38:50.091142 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:50.091057 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:52.091036874 +0000 UTC m=+67.729561989 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:50.091257 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:50.091151 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:50.091257 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:50.091202 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:52.091187344 +0000 UTC m=+67.729712463 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : secret "router-metrics-certs-default" not found Apr 22 18:38:50.138255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:50.138227 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jkhtl"] Apr 22 18:38:50.142334 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:38:50.142303 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50ef4af_f64b_4608_b9e7_126d66048d98.slice/crio-c5e06109e710362d1cb2bb2192eea81c7cd3925811cf0f926f6410fb66c8a650 WatchSource:0}: Error finding container c5e06109e710362d1cb2bb2192eea81c7cd3925811cf0f926f6410fb66c8a650: Status 404 returned error can't find the container with id c5e06109e710362d1cb2bb2192eea81c7cd3925811cf0f926f6410fb66c8a650 Apr 22 18:38:50.211551 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:38:50.211515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jkhtl" event={"ID":"c50ef4af-f64b-4608-b9e7-126d66048d98","Type":"ContainerStarted","Data":"c5e06109e710362d1cb2bb2192eea81c7cd3925811cf0f926f6410fb66c8a650"} Apr 22 18:38:52.106701 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:52.106659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:38:52.107108 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:52.106772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:38:52.107108 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:52.106840 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:52.107108 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:52.106924 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:56.106899226 +0000 UTC m=+71.745424330 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : secret "router-metrics-certs-default" not found Apr 22 18:38:52.107108 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:52.106950 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:56.106931719 +0000 UTC m=+71.745456820 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:52.221146 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:52.221090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f" event={"ID":"c98406ec-5c79-4a8a-b6e6-80a29ff6a160","Type":"ContainerStarted","Data":"7fd1abc4f6012db6b167f08fbce837180140fb1f699c4788bc2beabd5187ed94"} Apr 22 18:38:52.223116 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:52.223087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gsnxt" event={"ID":"ffeaef2c-e524-4aff-b6b1-3a7e61159f09","Type":"ContainerStarted","Data":"937cce3cd8637edb766c2083acaaf20938f1784a9b5057421bb9b4e478098cce"} Apr 22 18:38:52.240817 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:52.240775 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f" podStartSLOduration=1.6555040330000002 podStartE2EDuration="4.240763561s" 
podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.886813931 +0000 UTC m=+64.525339030" lastFinishedPulling="2026-04-22 18:38:51.472073454 +0000 UTC m=+67.110598558" observedRunningTime="2026-04-22 18:38:52.240455299 +0000 UTC m=+67.878980425" watchObservedRunningTime="2026-04-22 18:38:52.240763561 +0000 UTC m=+67.879288683" Apr 22 18:38:52.255898 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:52.255848 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-gsnxt" podStartSLOduration=1.636131971 podStartE2EDuration="4.255831123s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.85253744 +0000 UTC m=+64.491062541" lastFinishedPulling="2026-04-22 18:38:51.472236582 +0000 UTC m=+67.110761693" observedRunningTime="2026-04-22 18:38:52.254805557 +0000 UTC m=+67.893330680" watchObservedRunningTime="2026-04-22 18:38:52.255831123 +0000 UTC m=+67.894356247" Apr 22 18:38:54.228730 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:54.228694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jkhtl" event={"ID":"c50ef4af-f64b-4608-b9e7-126d66048d98","Type":"ContainerStarted","Data":"ac001a8a3248d1eb61fb6d8c51a8663cab0d19018de121b7ffc192681daf43c6"} Apr 22 18:38:54.229211 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:54.228815 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jkhtl" Apr 22 18:38:54.245265 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:54.245220 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jkhtl" podStartSLOduration=65.524435997 podStartE2EDuration="1m9.245207946s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="2026-04-22 18:38:50.145259027 +0000 UTC 
m=+65.783784132" lastFinishedPulling="2026-04-22 18:38:53.866030981 +0000 UTC m=+69.504556081" observedRunningTime="2026-04-22 18:38:54.244110575 +0000 UTC m=+69.882635696" watchObservedRunningTime="2026-04-22 18:38:54.245207946 +0000 UTC m=+69.883733067" Apr 22 18:38:54.529431 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:54.529359 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-642nb_f72f07da-c956-44a1-91e4-efb83a4ae9fc/dns-node-resolver/0.log" Apr 22 18:38:55.529118 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:55.529091 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-648xg_4f6716c1-8454-4bb0-a15d-144eeaa62e20/node-ca/0.log" Apr 22 18:38:56.137356 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:56.137323 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:38:56.137557 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:38:56.137417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:38:56.137557 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:56.137473 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:38:56.137557 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:56.137539 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:39:04.137522248 +0000 UTC m=+79.776047348 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : configmap references non-existent config key: service-ca.crt Apr 22 18:38:56.137557 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:38:56.137559 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:39:04.137549337 +0000 UTC m=+79.776074437 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : secret "router-metrics-certs-default" not found Apr 22 18:39:04.193531 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:04.193410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:39:04.193531 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:04.193489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: 
\"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85" Apr 22 18:39:04.193531 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:04.193506 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:39:20.193486886 +0000 UTC m=+95.832011991 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : configmap references non-existent config key: service-ca.crt Apr 22 18:39:04.193980 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:04.193563 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:39:04.193980 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:04.193649 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs podName:4db562fa-e132-42e2-8767-a511fe5551aa nodeName:}" failed. No retries permitted until 2026-04-22 18:39:20.193634101 +0000 UTC m=+95.832159201 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs") pod "router-default-765df457d-pff85" (UID: "4db562fa-e132-42e2-8767-a511fe5551aa") : secret "router-metrics-certs-default" not found Apr 22 18:39:16.596969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.596764 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6dq6k"] Apr 22 18:39:16.599463 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.599447 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.603060 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.603037 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-x7wl2\"" Apr 22 18:39:16.603281 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.603268 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:39:16.603328 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.603302 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:39:16.618335 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.618311 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6dq6k"] Apr 22 18:39:16.627598 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.627580 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2"] Apr 22 18:39:16.629194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.629181 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" Apr 22 18:39:16.632527 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.632510 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 18:39:16.632609 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.632517 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 18:39:16.632967 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.632954 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2z4l9\"" Apr 22 18:39:16.643361 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.643339 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2"] Apr 22 18:39:16.683898 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.683878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-data-volume\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.684017 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.683911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.684089 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.684041 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-crio-socket\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.684148 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.684128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.684190 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.684156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26zwk\" (UniqueName: \"kubernetes.io/projected/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-kube-api-access-26zwk\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.724623 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.724597 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-l4mxm"] Apr 22 18:39:16.726347 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.726332 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-l4mxm" Apr 22 18:39:16.731639 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.731619 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-m8dst\"" Apr 22 18:39:16.732205 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.732185 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:39:16.732523 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.732505 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:39:16.746601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.746580 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-l4mxm"] Apr 22 18:39:16.763248 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.763230 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fd45d456b-kdh4x"] Apr 22 18:39:16.765096 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.765081 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.773283 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.773264 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:39:16.774969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.774946 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:39:16.775208 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.775185 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fbmmk\"" Apr 22 18:39:16.775328 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.775223 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:39:16.779573 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.779555 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fd45d456b-kdh4x"] Apr 22 18:39:16.779820 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.779804 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:39:16.785180 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d5bda99-21d1-4ac5-ab04-13b17c683ad1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fvpf2\" (UID: \"7d5bda99-21d1-4ac5-ab04-13b17c683ad1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" Apr 22 18:39:16.785286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.785286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-crio-socket\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.785400 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785320 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26zwk\" (UniqueName: \"kubernetes.io/projected/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-kube-api-access-26zwk\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.785400 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7d5bda99-21d1-4ac5-ab04-13b17c683ad1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fvpf2\" (UID: \"7d5bda99-21d1-4ac5-ab04-13b17c683ad1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" Apr 22 18:39:16.785492 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-data-volume\") pod \"insights-runtime-extractor-6dq6k\" (UID: 
\"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.785492 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-crio-socket\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.785492 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.785734 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785706 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-data-volume\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.785985 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.785967 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.787650 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.787629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.800974 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.800946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26zwk\" (UniqueName: \"kubernetes.io/projected/f1033c20-6786-4aff-89a9-a9bf1b0c3ed8-kube-api-access-26zwk\") pod \"insights-runtime-extractor-6dq6k\" (UID: \"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8\") " pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.885890 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.885827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7d5bda99-21d1-4ac5-ab04-13b17c683ad1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fvpf2\" (UID: \"7d5bda99-21d1-4ac5-ab04-13b17c683ad1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" Apr 22 18:39:16.885890 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.885858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-registry-certificates\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.885890 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.885876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-bound-sa-token\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " 
pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.886067 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.885893 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-image-registry-private-configuration\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.886067 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.885943 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-registry-tls\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.886067 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.886008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d5bda99-21d1-4ac5-ab04-13b17c683ad1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fvpf2\" (UID: \"7d5bda99-21d1-4ac5-ab04-13b17c683ad1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" Apr 22 18:39:16.886067 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.886029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2fm\" (UniqueName: \"kubernetes.io/projected/82fb16f8-4657-404e-b89a-3a3762417871-kube-api-access-lm2fm\") pod \"downloads-6bcc868b7-l4mxm\" (UID: \"82fb16f8-4657-404e-b89a-3a3762417871\") " pod="openshift-console/downloads-6bcc868b7-l4mxm" Apr 22 18:39:16.886067 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.886047 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-ca-trust-extracted\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.886067 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.886064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krklg\" (UniqueName: \"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-kube-api-access-krklg\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.886275 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.886134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-installation-pull-secrets\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.886275 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.886171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-trusted-ca\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.886515 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.886498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d5bda99-21d1-4ac5-ab04-13b17c683ad1-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-fvpf2\" (UID: \"7d5bda99-21d1-4ac5-ab04-13b17c683ad1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" Apr 22 18:39:16.890205 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.890177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7d5bda99-21d1-4ac5-ab04-13b17c683ad1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fvpf2\" (UID: \"7d5bda99-21d1-4ac5-ab04-13b17c683ad1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" Apr 22 18:39:16.908074 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.908029 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6dq6k" Apr 22 18:39:16.937239 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.937214 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" Apr 22 18:39:16.987689 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.986987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-registry-certificates\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.987689 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.987027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-bound-sa-token\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.987689 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:39:16.987070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-image-registry-private-configuration\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.987689 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.987116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-registry-tls\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.987689 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.987190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2fm\" (UniqueName: \"kubernetes.io/projected/82fb16f8-4657-404e-b89a-3a3762417871-kube-api-access-lm2fm\") pod \"downloads-6bcc868b7-l4mxm\" (UID: \"82fb16f8-4657-404e-b89a-3a3762417871\") " pod="openshift-console/downloads-6bcc868b7-l4mxm" Apr 22 18:39:16.987689 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.987218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-ca-trust-extracted\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.987689 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.987242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krklg\" (UniqueName: 
\"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-kube-api-access-krklg\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.987689 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.987287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-installation-pull-secrets\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.987689 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.987311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-trusted-ca\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.988283 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.987937 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-ca-trust-extracted\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.988478 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.988446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-registry-certificates\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.988740 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.988700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-trusted-ca\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.990163 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.990118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-installation-pull-secrets\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.990163 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.990120 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-registry-tls\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.990748 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.990707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-image-registry-private-configuration\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:16.997275 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:16.997204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-bound-sa-token\") pod 
\"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:17.003593 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.003570 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2fm\" (UniqueName: \"kubernetes.io/projected/82fb16f8-4657-404e-b89a-3a3762417871-kube-api-access-lm2fm\") pod \"downloads-6bcc868b7-l4mxm\" (UID: \"82fb16f8-4657-404e-b89a-3a3762417871\") " pod="openshift-console/downloads-6bcc868b7-l4mxm" Apr 22 18:39:17.005217 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.005190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krklg\" (UniqueName: \"kubernetes.io/projected/7f399e35-f80e-49ff-9647-c3b6fbfa8e04-kube-api-access-krklg\") pod \"image-registry-7fd45d456b-kdh4x\" (UID: \"7f399e35-f80e-49ff-9647-c3b6fbfa8e04\") " pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:17.034273 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.034245 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-l4mxm" Apr 22 18:39:17.050313 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.050287 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6dq6k"] Apr 22 18:39:17.053664 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:17.053636 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1033c20_6786_4aff_89a9_a9bf1b0c3ed8.slice/crio-b4c6ed3d79fe4334d85fc1d991df524617696e9b58112caee18c4667a9242298 WatchSource:0}: Error finding container b4c6ed3d79fe4334d85fc1d991df524617696e9b58112caee18c4667a9242298: Status 404 returned error can't find the container with id b4c6ed3d79fe4334d85fc1d991df524617696e9b58112caee18c4667a9242298 Apr 22 18:39:17.073373 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.073348 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:17.082444 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.082401 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2"] Apr 22 18:39:17.088393 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:17.088370 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d5bda99_21d1_4ac5_ab04_13b17c683ad1.slice/crio-e36fec1f52ca3a6963d37f4de313149b950428504c78d4dec4d5546e9a9dd73a WatchSource:0}: Error finding container e36fec1f52ca3a6963d37f4de313149b950428504c78d4dec4d5546e9a9dd73a: Status 404 returned error can't find the container with id e36fec1f52ca3a6963d37f4de313149b950428504c78d4dec4d5546e9a9dd73a Apr 22 18:39:17.161787 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.161707 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-6bcc868b7-l4mxm"] Apr 22 18:39:17.165612 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:17.165582 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82fb16f8_4657_404e_b89a_3a3762417871.slice/crio-54d1a415aca74fffa2d167470508a2a81237a888b4872f0d89ff7dd0d35ab7ac WatchSource:0}: Error finding container 54d1a415aca74fffa2d167470508a2a81237a888b4872f0d89ff7dd0d35ab7ac: Status 404 returned error can't find the container with id 54d1a415aca74fffa2d167470508a2a81237a888b4872f0d89ff7dd0d35ab7ac Apr 22 18:39:17.210118 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.210092 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fd45d456b-kdh4x"] Apr 22 18:39:17.213348 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:17.213317 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f399e35_f80e_49ff_9647_c3b6fbfa8e04.slice/crio-a0dfe33b8eb322cfc322d906908653f62452d1a3c2d99110d4976feb70c73770 WatchSource:0}: Error finding container a0dfe33b8eb322cfc322d906908653f62452d1a3c2d99110d4976feb70c73770: Status 404 returned error can't find the container with id a0dfe33b8eb322cfc322d906908653f62452d1a3c2d99110d4976feb70c73770 Apr 22 18:39:17.273961 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.273937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" event={"ID":"7f399e35-f80e-49ff-9647-c3b6fbfa8e04","Type":"ContainerStarted","Data":"f49c7f2a5a7da7e2d55ba21dfa44934d259bad7bacd65fa14420c8e75923b68f"} Apr 22 18:39:17.274063 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.273972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" 
event={"ID":"7f399e35-f80e-49ff-9647-c3b6fbfa8e04","Type":"ContainerStarted","Data":"a0dfe33b8eb322cfc322d906908653f62452d1a3c2d99110d4976feb70c73770"} Apr 22 18:39:17.274063 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.274012 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:17.275036 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.275010 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-l4mxm" event={"ID":"82fb16f8-4657-404e-b89a-3a3762417871","Type":"ContainerStarted","Data":"54d1a415aca74fffa2d167470508a2a81237a888b4872f0d89ff7dd0d35ab7ac"} Apr 22 18:39:17.275959 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.275939 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" event={"ID":"7d5bda99-21d1-4ac5-ab04-13b17c683ad1","Type":"ContainerStarted","Data":"e36fec1f52ca3a6963d37f4de313149b950428504c78d4dec4d5546e9a9dd73a"} Apr 22 18:39:17.277165 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.277146 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6dq6k" event={"ID":"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8","Type":"ContainerStarted","Data":"bb0a5b50c7176626a00ff91085043e33b69d273afdf4cd360e9222c8f19e09e8"} Apr 22 18:39:17.277245 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.277169 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6dq6k" event={"ID":"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8","Type":"ContainerStarted","Data":"b4c6ed3d79fe4334d85fc1d991df524617696e9b58112caee18c4667a9242298"} Apr 22 18:39:17.293783 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:17.293747 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" 
podStartSLOduration=1.293736343 podStartE2EDuration="1.293736343s" podCreationTimestamp="2026-04-22 18:39:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:39:17.292995717 +0000 UTC m=+92.931520874" watchObservedRunningTime="2026-04-22 18:39:17.293736343 +0000 UTC m=+92.932261465" Apr 22 18:39:18.281107 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:18.281056 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" event={"ID":"7d5bda99-21d1-4ac5-ab04-13b17c683ad1","Type":"ContainerStarted","Data":"213d37357d75ccb90154cdd0f053df0b1e9a3d2e99bcd46a6b6720a37bffa825"} Apr 22 18:39:18.283211 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:18.283179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6dq6k" event={"ID":"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8","Type":"ContainerStarted","Data":"a1b9383b4d8268fd8f307fd02aacfd8a5be12a1f02893c656802473c3084494e"} Apr 22 18:39:18.297231 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:18.297182 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvpf2" podStartSLOduration=1.297150612 podStartE2EDuration="2.297164767s" podCreationTimestamp="2026-04-22 18:39:16 +0000 UTC" firstStartedPulling="2026-04-22 18:39:17.090538634 +0000 UTC m=+92.729063741" lastFinishedPulling="2026-04-22 18:39:18.090552784 +0000 UTC m=+93.729077896" observedRunningTime="2026-04-22 18:39:18.296699473 +0000 UTC m=+93.935224596" watchObservedRunningTime="2026-04-22 18:39:18.297164767 +0000 UTC m=+93.935689893" Apr 22 18:39:20.215656 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.215614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:20.216035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.215677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:20.216478 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.216447 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db562fa-e132-42e2-8767-a511fe5551aa-service-ca-bundle\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:20.218259 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.218232 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db562fa-e132-42e2-8767-a511fe5551aa-metrics-certs\") pod \"router-default-765df457d-pff85\" (UID: \"4db562fa-e132-42e2-8767-a511fe5551aa\") " pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:20.291305 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.291269 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6dq6k" event={"ID":"f1033c20-6786-4aff-89a9-a9bf1b0c3ed8","Type":"ContainerStarted","Data":"4b6850820f5ed77c01e5280a1ba4449e9d6656488c3b0c32204007ac294709d7"}
Apr 22 18:39:20.310136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.310094 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6dq6k" podStartSLOduration=2.133082249 podStartE2EDuration="4.310076294s" podCreationTimestamp="2026-04-22 18:39:16 +0000 UTC" firstStartedPulling="2026-04-22 18:39:17.118208988 +0000 UTC m=+92.756734096" lastFinishedPulling="2026-04-22 18:39:19.295203025 +0000 UTC m=+94.933728141" observedRunningTime="2026-04-22 18:39:20.308278974 +0000 UTC m=+95.946804099" watchObservedRunningTime="2026-04-22 18:39:20.310076294 +0000 UTC m=+95.948601415"
Apr 22 18:39:20.479048 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.478969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:20.606594 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.606558 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-765df457d-pff85"]
Apr 22 18:39:20.609381 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:20.609351 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4db562fa_e132_42e2_8767_a511fe5551aa.slice/crio-21da83b0227d3480f26cd098e3487bcaeaf21299ce38ad031bd98eb156796b36 WatchSource:0}: Error finding container 21da83b0227d3480f26cd098e3487bcaeaf21299ce38ad031bd98eb156796b36: Status 404 returned error can't find the container with id 21da83b0227d3480f26cd098e3487bcaeaf21299ce38ad031bd98eb156796b36
Apr 22 18:39:20.618096 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.618062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv"
Apr 22 18:39:20.620660 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.620633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bae8419-18dc-4dd7-a71d-bacb197e4c26-metrics-tls\") pod \"dns-default-khmjv\" (UID: \"7bae8419-18dc-4dd7-a71d-bacb197e4c26\") " pod="openshift-dns/dns-default-khmjv"
Apr 22 18:39:20.676548 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.676525 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cfw2h\""
Apr 22 18:39:20.684881 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.684856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-khmjv"
Apr 22 18:39:20.718696 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.718657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89"
Apr 22 18:39:20.721396 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.721356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41ba337b-6571-4648-95a1-73c5f1faa37f-cert\") pod \"ingress-canary-psh89\" (UID: \"41ba337b-6571-4648-95a1-73c5f1faa37f\") " pod="openshift-ingress-canary/ingress-canary-psh89"
Apr 22 18:39:20.815283 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.815251 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-khmjv"]
Apr 22 18:39:20.818611 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:20.818581 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bae8419_18dc_4dd7_a71d_bacb197e4c26.slice/crio-3a63b86076a85d5a84502ec13b2454dbe278ed7342ffaf8264eaf33f89db5f32 WatchSource:0}: Error finding container 3a63b86076a85d5a84502ec13b2454dbe278ed7342ffaf8264eaf33f89db5f32: Status 404 returned error can't find the container with id 3a63b86076a85d5a84502ec13b2454dbe278ed7342ffaf8264eaf33f89db5f32
Apr 22 18:39:20.991672 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.991595 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rz76n\""
Apr 22 18:39:20.999512 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:20.999486 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-psh89"
Apr 22 18:39:21.128944 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:21.128886 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-psh89"]
Apr 22 18:39:21.134022 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:21.133991 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41ba337b_6571_4648_95a1_73c5f1faa37f.slice/crio-2d381323bb7a44d639aa55ec9a6436796985a507d33554a9a24dee89e6320043 WatchSource:0}: Error finding container 2d381323bb7a44d639aa55ec9a6436796985a507d33554a9a24dee89e6320043: Status 404 returned error can't find the container with id 2d381323bb7a44d639aa55ec9a6436796985a507d33554a9a24dee89e6320043
Apr 22 18:39:21.295321 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:21.295238 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-psh89" event={"ID":"41ba337b-6571-4648-95a1-73c5f1faa37f","Type":"ContainerStarted","Data":"2d381323bb7a44d639aa55ec9a6436796985a507d33554a9a24dee89e6320043"}
Apr 22 18:39:21.296309 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:21.296275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-khmjv" event={"ID":"7bae8419-18dc-4dd7-a71d-bacb197e4c26","Type":"ContainerStarted","Data":"3a63b86076a85d5a84502ec13b2454dbe278ed7342ffaf8264eaf33f89db5f32"}
Apr 22 18:39:21.297752 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:21.297711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-765df457d-pff85" event={"ID":"4db562fa-e132-42e2-8767-a511fe5551aa","Type":"ContainerStarted","Data":"0370c6a2fc1c1c6e8fede308cef85d39527c040012969111546c90f505ed54da"}
Apr 22 18:39:21.297863 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:21.297760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-765df457d-pff85" event={"ID":"4db562fa-e132-42e2-8767-a511fe5551aa","Type":"ContainerStarted","Data":"21da83b0227d3480f26cd098e3487bcaeaf21299ce38ad031bd98eb156796b36"}
Apr 22 18:39:21.318842 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:21.318798 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-765df457d-pff85" podStartSLOduration=33.318781401 podStartE2EDuration="33.318781401s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:39:21.317617801 +0000 UTC m=+96.956142927" watchObservedRunningTime="2026-04-22 18:39:21.318781401 +0000 UTC m=+96.957306525"
Apr 22 18:39:21.479994 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:21.479953 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:21.482838 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:21.482812 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:22.302539 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:22.302490 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:22.304183 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:22.304008 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-765df457d-pff85"
Apr 22 18:39:23.306909 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:23.306866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-khmjv" event={"ID":"7bae8419-18dc-4dd7-a71d-bacb197e4c26","Type":"ContainerStarted","Data":"d76da749c4f4089e44501fd8936208dd027c0e8de8dd9ccfbb3a0f4b78ced826"}
Apr 22 18:39:23.306909 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:23.306914 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-khmjv" event={"ID":"7bae8419-18dc-4dd7-a71d-bacb197e4c26","Type":"ContainerStarted","Data":"d42c577fcf17cf74985f0e3f8e4f4a360c1569965ffd0a39b4f701096d1a1f1a"}
Apr 22 18:39:23.324390 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:23.324335 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-khmjv" podStartSLOduration=65.756705589 podStartE2EDuration="1m7.324319185s" podCreationTimestamp="2026-04-22 18:38:16 +0000 UTC" firstStartedPulling="2026-04-22 18:39:20.820551166 +0000 UTC m=+96.459076266" lastFinishedPulling="2026-04-22 18:39:22.38816475 +0000 UTC m=+98.026689862" observedRunningTime="2026-04-22 18:39:23.323528571 +0000 UTC m=+98.962053694" watchObservedRunningTime="2026-04-22 18:39:23.324319185 +0000 UTC m=+98.962844307"
Apr 22 18:39:24.310735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:24.310677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-psh89" event={"ID":"41ba337b-6571-4648-95a1-73c5f1faa37f","Type":"ContainerStarted","Data":"0872cd8babb51aa80cac654dd7919afd3a780b69afb74590a3ef28728061be52"}
Apr 22 18:39:24.311259 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:24.311046 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-khmjv"
Apr 22 18:39:24.326964 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:24.326911 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-psh89" podStartSLOduration=65.806712983 podStartE2EDuration="1m8.326897116s" podCreationTimestamp="2026-04-22 18:38:16 +0000 UTC" firstStartedPulling="2026-04-22 18:39:21.13655193 +0000 UTC m=+96.775077044" lastFinishedPulling="2026-04-22 18:39:23.656736077 +0000 UTC m=+99.295261177" observedRunningTime="2026-04-22 18:39:24.325700268 +0000 UTC m=+99.964225390" watchObservedRunningTime="2026-04-22 18:39:24.326897116 +0000 UTC m=+99.965422238"
Apr 22 18:39:25.233004 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:25.232976 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jkhtl"
Apr 22 18:39:30.123410 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.123371 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"]
Apr 22 18:39:30.127154 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.127130 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.129948 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.129927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 18:39:30.130078 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.129927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 18:39:30.130326 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.130305 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"]
Apr 22 18:39:30.130707 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.130688 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-lvjpr\""
Apr 22 18:39:30.130832 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.130728 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:39:30.130832 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.130696 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:39:30.130947 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.130691 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:39:30.133846 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.133827 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.138388 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.138369 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 18:39:30.138511 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.138370 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 18:39:30.138588 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.138511 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-2jz9q\""
Apr 22 18:39:30.138678 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.138661 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 18:39:30.142672 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.142640 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"]
Apr 22 18:39:30.151599 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.151580 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"]
Apr 22 18:39:30.172441 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.172415 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hjrlm"]
Apr 22 18:39:30.175989 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.175968 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.178119 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.178098 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:39:30.178259 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.178241 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:39:30.178483 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.178459 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mt2n7\""
Apr 22 18:39:30.178894 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.178868 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:39:30.302001 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.301967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d37dbf7-4750-4106-9017-8187fc45ab69-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.302001 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-root\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302254 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c68dd5e-6d55-44da-b104-a82c798b9b6f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.302254 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302038 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-tls\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302254 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.302254 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb8ld\" (UniqueName: \"kubernetes.io/projected/8d37dbf7-4750-4106-9017-8187fc45ab69-kube-api-access-wb8ld\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.302254 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-wtmp\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk7nr\" (UniqueName: \"kubernetes.io/projected/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-api-access-wk7nr\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.302508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.302508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.302508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.302508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302445 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07af08e5-baae-4347-8c0e-109a222d35de-metrics-client-ca\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jhjv\" (UniqueName: \"kubernetes.io/projected/07af08e5-baae-4347-8c0e-109a222d35de-kube-api-access-7jhjv\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302504 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302920 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.302920 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-textfile\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302920 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-sys\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.302920 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.302622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3c68dd5e-6d55-44da-b104-a82c798b9b6f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.403796 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.403753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-sys\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.403969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.403823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3c68dd5e-6d55-44da-b104-a82c798b9b6f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.403969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.403870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d37dbf7-4750-4106-9017-8187fc45ab69-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.403969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.403899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-sys\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.403969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.403914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-root\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.403969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.403963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-root\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.404247 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.403978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c68dd5e-6d55-44da-b104-a82c798b9b6f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.404247 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-tls\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.404247 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.404247 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb8ld\" (UniqueName: \"kubernetes.io/projected/8d37dbf7-4750-4106-9017-8187fc45ab69-kube-api-access-wb8ld\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.404247 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:30.404188 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:39:30.404467 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:30.404264 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-tls podName:07af08e5-baae-4347-8c0e-109a222d35de nodeName:}" failed. No retries permitted until 2026-04-22 18:39:30.904240299 +0000 UTC m=+106.542765405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-tls") pod "node-exporter-hjrlm" (UID: "07af08e5-baae-4347-8c0e-109a222d35de") : secret "node-exporter-tls" not found
Apr 22 18:39:30.404467 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3c68dd5e-6d55-44da-b104-a82c798b9b6f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.404467 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-wtmp\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.404467 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk7nr\" (UniqueName: \"kubernetes.io/projected/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-api-access-wk7nr\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.404467 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07af08e5-baae-4347-8c0e-109a222d35de-metrics-client-ca\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jhjv\" (UniqueName: \"kubernetes.io/projected/07af08e5-baae-4347-8c0e-109a222d35de-kube-api-access-7jhjv\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404565 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-textfile\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.404735 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d37dbf7-4750-4106-9017-8187fc45ab69-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"
Apr 22 18:39:30.405182 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.404810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-wtmp\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm"
Apr 22 18:39:30.405182 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:30.405105 2577 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 22 18:39:30.405182 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:30.405178 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-tls podName:8d37dbf7-4750-4106-9017-8187fc45ab69 nodeName:}" failed. No retries permitted until 2026-04-22 18:39:30.905156621 +0000 UTC m=+106.543681739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-tr5s8" (UID: "8d37dbf7-4750-4106-9017-8187fc45ab69") : secret "openshift-state-metrics-tls" not found
Apr 22 18:39:30.405349 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.405227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c68dd5e-6d55-44da-b104-a82c798b9b6f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.405349 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:30.405311 2577 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 18:39:30.405442 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.405354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"
Apr 22 18:39:30.405442 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:39:30.405376 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-tls podName:3c68dd5e-6d55-44da-b104-a82c798b9b6f nodeName:}" failed. No retries permitted until 2026-04-22 18:39:30.905362599 +0000 UTC m=+106.543887702 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-qqsrj" (UID: "3c68dd5e-6d55-44da-b104-a82c798b9b6f") : secret "kube-state-metrics-tls" not found Apr 22 18:39:30.405442 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.405423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm" Apr 22 18:39:30.405593 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.405438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-textfile\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm" Apr 22 18:39:30.406210 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.406183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07af08e5-baae-4347-8c0e-109a222d35de-metrics-client-ca\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm" Apr 22 18:39:30.407575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.407548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" Apr 22 18:39:30.407771 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.407751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm" Apr 22 18:39:30.408063 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.408045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" Apr 22 18:39:30.418894 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.418863 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb8ld\" (UniqueName: \"kubernetes.io/projected/8d37dbf7-4750-4106-9017-8187fc45ab69-kube-api-access-wb8ld\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" Apr 22 18:39:30.419407 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.419376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk7nr\" (UniqueName: \"kubernetes.io/projected/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-api-access-wk7nr\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" Apr 22 18:39:30.422525 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.419952 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7jhjv\" (UniqueName: \"kubernetes.io/projected/07af08e5-baae-4347-8c0e-109a222d35de-kube-api-access-7jhjv\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm" Apr 22 18:39:30.909910 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.909864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" Apr 22 18:39:30.910089 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.909931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" Apr 22 18:39:30.910089 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.910006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-tls\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm" Apr 22 18:39:30.912788 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.912760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07af08e5-baae-4347-8c0e-109a222d35de-node-exporter-tls\") pod \"node-exporter-hjrlm\" (UID: \"07af08e5-baae-4347-8c0e-109a222d35de\") " pod="openshift-monitoring/node-exporter-hjrlm" Apr 22 
18:39:30.912936 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.912836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c68dd5e-6d55-44da-b104-a82c798b9b6f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qqsrj\" (UID: \"3c68dd5e-6d55-44da-b104-a82c798b9b6f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" Apr 22 18:39:30.913283 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:30.913264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d37dbf7-4750-4106-9017-8187fc45ab69-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-tr5s8\" (UID: \"8d37dbf7-4750-4106-9017-8187fc45ab69\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" Apr 22 18:39:31.037836 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.037801 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" Apr 22 18:39:31.047558 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.047536 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" Apr 22 18:39:31.086115 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.086085 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hjrlm" Apr 22 18:39:31.204020 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.203944 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:39:31.209622 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.209596 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.212195 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.211959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:39:31.212195 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.211985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:39:31.212195 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.211997 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:39:31.212195 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.212070 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:39:31.212445 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.212296 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:39:31.212445 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.212340 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:39:31.212538 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.212469 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:39:31.212607 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.212581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:39:31.212732 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.212654 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hgh2v\"" Apr 22 18:39:31.212732 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.212683 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:39:31.219658 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.219635 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:39:31.313090 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-config-volume\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313139 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313421 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313421 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313421 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nl2\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-kube-api-access-n7nl2\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313542 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313542 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-config-out\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313542 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313500 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313542 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313526 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.313703 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.313556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-web-config\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.414626 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.414833 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nl2\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-kube-api-access-n7nl2\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.414833 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.414833 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414761 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-config-out\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.414833 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414808 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.415073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.415073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-web-config\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.415073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.415073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-config-volume\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.415073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.414991 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.415073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.415020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.415073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.415068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.415421 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.415134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.416809 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.415508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.416809 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.416216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.416809 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.416755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.417924 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.417896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-config-out\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.418043 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.417971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.418323 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.418281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.418747 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.418707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.418884 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.418862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-web-config\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.418958 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.418895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-config-volume\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.419896 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.419864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.419989 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.419946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.420873 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.420819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.423667 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.423647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nl2\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-kube-api-access-n7nl2\") pod \"alertmanager-main-0\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:31.522926 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:31.522850 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:39:33.203003 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:33.199880 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07af08e5_baae_4347_8c0e_109a222d35de.slice/crio-624e3a26a780e4b7c62e8328fa86a63e7b2b78cfa5e482f14df0cc43acd096f4 WatchSource:0}: Error finding container 624e3a26a780e4b7c62e8328fa86a63e7b2b78cfa5e482f14df0cc43acd096f4: Status 404 returned error can't find the container with id 624e3a26a780e4b7c62e8328fa86a63e7b2b78cfa5e482f14df0cc43acd096f4 Apr 22 18:39:33.336416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:33.336393 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8"] Apr 22 18:39:33.349426 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:33.349399 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d37dbf7_4750_4106_9017_8187fc45ab69.slice/crio-6845abbbcf7ffc3ba8e39fbda4ae170984dc7b0880d0bac0c311efed12199075 WatchSource:0}: Error finding container 6845abbbcf7ffc3ba8e39fbda4ae170984dc7b0880d0bac0c311efed12199075: Status 404 returned error can't find the container with id 6845abbbcf7ffc3ba8e39fbda4ae170984dc7b0880d0bac0c311efed12199075 Apr 22 18:39:33.350827 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:33.350793 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjrlm" event={"ID":"07af08e5-baae-4347-8c0e-109a222d35de","Type":"ContainerStarted","Data":"624e3a26a780e4b7c62e8328fa86a63e7b2b78cfa5e482f14df0cc43acd096f4"} Apr 22 18:39:33.352284 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:33.352262 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qqsrj"] Apr 22 18:39:33.355670 ip-10-0-139-10 kubenswrapper[2577]: W0422 
18:39:33.355644 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c68dd5e_6d55_44da_b104_a82c798b9b6f.slice/crio-a353cdd5e11de574bd074b5d30e489ee8cf11c5224167f87edc42789baedb303 WatchSource:0}: Error finding container a353cdd5e11de574bd074b5d30e489ee8cf11c5224167f87edc42789baedb303: Status 404 returned error can't find the container with id a353cdd5e11de574bd074b5d30e489ee8cf11c5224167f87edc42789baedb303 Apr 22 18:39:33.379561 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:33.379521 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:39:33.381978 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:33.381955 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9553449b_f89a_41d4_bd95_0169b3d3312b.slice/crio-9994610ab20fe1d117cc3d16364bb31d4cf565b00b35358af1e2b852f35d7aa5 WatchSource:0}: Error finding container 9994610ab20fe1d117cc3d16364bb31d4cf565b00b35358af1e2b852f35d7aa5: Status 404 returned error can't find the container with id 9994610ab20fe1d117cc3d16364bb31d4cf565b00b35358af1e2b852f35d7aa5 Apr 22 18:39:34.319006 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.318073 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-khmjv" Apr 22 18:39:34.356981 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.356924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerStarted","Data":"9994610ab20fe1d117cc3d16364bb31d4cf565b00b35358af1e2b852f35d7aa5"} Apr 22 18:39:34.358574 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.358520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" 
event={"ID":"3c68dd5e-6d55-44da-b104-a82c798b9b6f","Type":"ContainerStarted","Data":"a353cdd5e11de574bd074b5d30e489ee8cf11c5224167f87edc42789baedb303"} Apr 22 18:39:34.361110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.360997 2577 generic.go:358] "Generic (PLEG): container finished" podID="07af08e5-baae-4347-8c0e-109a222d35de" containerID="61d09ed1a67d81d801ba42af7259bb5f26d82bc5bfffcd015027f223585d708c" exitCode=0 Apr 22 18:39:34.361110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.361080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjrlm" event={"ID":"07af08e5-baae-4347-8c0e-109a222d35de","Type":"ContainerDied","Data":"61d09ed1a67d81d801ba42af7259bb5f26d82bc5bfffcd015027f223585d708c"} Apr 22 18:39:34.362900 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.362874 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-l4mxm" event={"ID":"82fb16f8-4657-404e-b89a-3a3762417871","Type":"ContainerStarted","Data":"36d1cd2eb5ebca63781de2b9db0205a556a7ed6cd68b9f515d95471558a77839"} Apr 22 18:39:34.363255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.363226 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-l4mxm" Apr 22 18:39:34.366068 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.365977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" event={"ID":"8d37dbf7-4750-4106-9017-8187fc45ab69","Type":"ContainerStarted","Data":"3236a6c7352065bc50bd308ad644f56dc2746be691665870131d392869f3da03"} Apr 22 18:39:34.366068 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.366008 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" 
event={"ID":"8d37dbf7-4750-4106-9017-8187fc45ab69","Type":"ContainerStarted","Data":"85196f04fccfe96892ced69fc69a6d00e522722ac640a106fee4b967c96f9b09"} Apr 22 18:39:34.366068 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.366023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" event={"ID":"8d37dbf7-4750-4106-9017-8187fc45ab69","Type":"ContainerStarted","Data":"6845abbbcf7ffc3ba8e39fbda4ae170984dc7b0880d0bac0c311efed12199075"} Apr 22 18:39:34.371413 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.371372 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-l4mxm" Apr 22 18:39:34.398020 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.397955 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-l4mxm" podStartSLOduration=2.257622698 podStartE2EDuration="18.397937851s" podCreationTimestamp="2026-04-22 18:39:16 +0000 UTC" firstStartedPulling="2026-04-22 18:39:17.167553603 +0000 UTC m=+92.806078708" lastFinishedPulling="2026-04-22 18:39:33.307868761 +0000 UTC m=+108.946393861" observedRunningTime="2026-04-22 18:39:34.39690024 +0000 UTC m=+110.035425367" watchObservedRunningTime="2026-04-22 18:39:34.397937851 +0000 UTC m=+110.036462974" Apr 22 18:39:34.551536 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.550216 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6"] Apr 22 18:39:34.553820 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.553758 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.557276 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.556969 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:39:34.557276 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.556991 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-f55tx\"" Apr 22 18:39:34.558029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.557842 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:39:34.558029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.557890 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-f0gn3bqapf22c\"" Apr 22 18:39:34.558029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.557982 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:39:34.558616 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.558276 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:39:34.566411 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.566375 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6"] Apr 22 18:39:34.644653 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.644564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-secret-metrics-server-client-certs\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: 
\"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.644879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.644696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f90f953-62dc-48c5-ac04-3780fa1d00ba-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.644879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.644770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8f90f953-62dc-48c5-ac04-3780fa1d00ba-metrics-server-audit-profiles\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.644879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.644807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-client-ca-bundle\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.644879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.644854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-secret-metrics-server-tls\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.645116 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.645018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w49mb\" (UniqueName: \"kubernetes.io/projected/8f90f953-62dc-48c5-ac04-3780fa1d00ba-kube-api-access-w49mb\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.645116 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.645057 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8f90f953-62dc-48c5-ac04-3780fa1d00ba-audit-log\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.746511 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.746474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-secret-metrics-server-client-certs\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.746776 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.746556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f90f953-62dc-48c5-ac04-3780fa1d00ba-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.746776 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.746596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" 
(UniqueName: \"kubernetes.io/configmap/8f90f953-62dc-48c5-ac04-3780fa1d00ba-metrics-server-audit-profiles\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.746776 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.746630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-client-ca-bundle\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.747709 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.747157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-secret-metrics-server-tls\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.747709 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.747274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w49mb\" (UniqueName: \"kubernetes.io/projected/8f90f953-62dc-48c5-ac04-3780fa1d00ba-kube-api-access-w49mb\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.747709 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.747310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8f90f953-62dc-48c5-ac04-3780fa1d00ba-audit-log\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 
18:39:34.747709 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.747666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8f90f953-62dc-48c5-ac04-3780fa1d00ba-audit-log\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.749611 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.749564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f90f953-62dc-48c5-ac04-3780fa1d00ba-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.750191 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.750147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-secret-metrics-server-tls\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.750434 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.750391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8f90f953-62dc-48c5-ac04-3780fa1d00ba-metrics-server-audit-profiles\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.750766 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.750702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-secret-metrics-server-client-certs\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.751085 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.751065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f90f953-62dc-48c5-ac04-3780fa1d00ba-client-ca-bundle\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.755351 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.755311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w49mb\" (UniqueName: \"kubernetes.io/projected/8f90f953-62dc-48c5-ac04-3780fa1d00ba-kube-api-access-w49mb\") pod \"metrics-server-6c6cfdd7fb-nnqg6\" (UID: \"8f90f953-62dc-48c5-ac04-3780fa1d00ba\") " pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:34.869129 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:34.869097 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:35.344258 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.344220 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-767895b6fd-9qkcg"] Apr 22 18:39:35.349016 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.348994 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.351126 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.351099 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:39:35.351239 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.351164 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:39:35.351481 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.351459 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:39:35.351784 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.351680 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:39:35.351784 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.351755 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-d8bhh\"" Apr 22 18:39:35.352009 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.351963 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:39:35.357812 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.357785 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:39:35.358287 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.358263 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-767895b6fd-9qkcg"] Apr 22 18:39:35.453775 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.453747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-4qw9h\" (UniqueName: \"kubernetes.io/projected/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-kube-api-access-4qw9h\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.454601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.453797 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-federate-client-tls\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.454601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.453934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-telemeter-client-tls\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.454601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.454069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-secret-telemeter-client\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.454601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.454145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.454601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.454229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-serving-certs-ca-bundle\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.454601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.454309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-metrics-client-ca\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.454601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.454398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-telemeter-trusted-ca-bundle\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.547024 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.546961 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6"] Apr 22 18:39:35.555089 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.555014 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-secret-telemeter-client\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.555089 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.555061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.555267 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.555102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-serving-certs-ca-bundle\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.555267 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.555128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-metrics-client-ca\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.555267 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.555165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.555267 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.555207 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qw9h\" (UniqueName: \"kubernetes.io/projected/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-kube-api-access-4qw9h\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.555267 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.555237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-federate-client-tls\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.555520 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.555298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-telemeter-client-tls\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.558679 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:35.558481 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f90f953_62dc_48c5_ac04_3780fa1d00ba.slice/crio-78dd7fbdd6f470e4d05ca22355f8ef5763cc1210c0eded7e705a536dc4724018 WatchSource:0}: Error finding container 78dd7fbdd6f470e4d05ca22355f8ef5763cc1210c0eded7e705a536dc4724018: Status 404 returned error can't find the container with id 
78dd7fbdd6f470e4d05ca22355f8ef5763cc1210c0eded7e705a536dc4724018 Apr 22 18:39:35.562865 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.562813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-telemeter-trusted-ca-bundle\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.563285 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.563243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-metrics-client-ca\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.565476 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.565380 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.566987 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.566929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-telemeter-client-tls\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.567858 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.567797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-4qw9h\" (UniqueName: \"kubernetes.io/projected/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-kube-api-access-4qw9h\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.568597 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.568553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-secret-telemeter-client\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.569081 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.569023 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-federate-client-tls\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.572245 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.572195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e29b4ae-9071-47ff-8090-f8ab3c12bd28-serving-certs-ca-bundle\") pod \"telemeter-client-767895b6fd-9qkcg\" (UID: \"7e29b4ae-9071-47ff-8090-f8ab3c12bd28\") " pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.663240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.663071 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" Apr 22 18:39:35.836114 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:35.836086 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-767895b6fd-9qkcg"] Apr 22 18:39:35.841444 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:35.841410 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e29b4ae_9071_47ff_8090_f8ab3c12bd28.slice/crio-6e48f9fdca0863b2497968525559c613fff587b185e660cd9cd36cddb83632a3 WatchSource:0}: Error finding container 6e48f9fdca0863b2497968525559c613fff587b185e660cd9cd36cddb83632a3: Status 404 returned error can't find the container with id 6e48f9fdca0863b2497968525559c613fff587b185e660cd9cd36cddb83632a3 Apr 22 18:39:36.203007 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.202964 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fbf6d666b-dbc8b"] Apr 22 18:39:36.225315 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.225283 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fbf6d666b-dbc8b"] Apr 22 18:39:36.225538 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.225426 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.228276 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.227743 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:39:36.228983 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.227781 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:39:36.228983 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.227800 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:39:36.228983 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.227814 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:39:36.228983 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.227923 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xkjq2\"" Apr 22 18:39:36.228983 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.228020 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:39:36.233483 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.233460 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:39:36.263196 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.263167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-service-ca\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.263327 ip-10-0-139-10 kubenswrapper[2577]: I0422 
18:39:36.263215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-serving-cert\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.263437 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.263410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j2r4\" (UniqueName: \"kubernetes.io/projected/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-kube-api-access-8j2r4\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.263540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.263459 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-config\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.263540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.263510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-oauth-config\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.263637 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.263555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-trusted-ca-bundle\") pod 
\"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.263637 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.263581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-oauth-serving-cert\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.365966 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.365479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j2r4\" (UniqueName: \"kubernetes.io/projected/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-kube-api-access-8j2r4\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.365966 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.365536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-config\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.365966 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.365605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-oauth-config\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.365966 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.365633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-trusted-ca-bundle\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.365966 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.365661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-oauth-serving-cert\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.365966 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.365744 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-service-ca\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.365966 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.365775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-serving-cert\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.367691 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.367662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-config\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.369603 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.369535 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-serving-cert\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.369951 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.369888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-oauth-config\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.377078 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.377017 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjrlm" event={"ID":"07af08e5-baae-4347-8c0e-109a222d35de","Type":"ContainerStarted","Data":"10eff99dc453bd25daa94816423cc3c4e05725b9476beb64b8c0be32a8321b2f"} Apr 22 18:39:36.377078 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.377058 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjrlm" event={"ID":"07af08e5-baae-4347-8c0e-109a222d35de","Type":"ContainerStarted","Data":"0e40e56e67a0c3d1da304315d9ac336b84f6d9e5f1416520d1612c931ba97c59"} Apr 22 18:39:36.377879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.377857 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j2r4\" (UniqueName: \"kubernetes.io/projected/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-kube-api-access-8j2r4\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.378535 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.378463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-trusted-ca-bundle\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.378798 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.378778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-oauth-serving-cert\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.379298 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.379245 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-service-ca\") pod \"console-6fbf6d666b-dbc8b\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.379501 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.379386 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" event={"ID":"8f90f953-62dc-48c5-ac04-3780fa1d00ba","Type":"ContainerStarted","Data":"78dd7fbdd6f470e4d05ca22355f8ef5763cc1210c0eded7e705a536dc4724018"} Apr 22 18:39:36.389005 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.386903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" event={"ID":"8d37dbf7-4750-4106-9017-8187fc45ab69","Type":"ContainerStarted","Data":"90cfc2d2d7da4f476b32750069d2aef45eeb2c92ed0daf2c75712123e8f767ac"} Apr 22 18:39:36.389588 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.389564 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" 
event={"ID":"7e29b4ae-9071-47ff-8090-f8ab3c12bd28","Type":"ContainerStarted","Data":"6e48f9fdca0863b2497968525559c613fff587b185e660cd9cd36cddb83632a3"} Apr 22 18:39:36.392652 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.392199 2577 generic.go:358] "Generic (PLEG): container finished" podID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerID="b44224d77c023c22855ea4bcd3f0368296267d60f0d5a91dc91444db2b475c81" exitCode=0 Apr 22 18:39:36.392652 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.392261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerDied","Data":"b44224d77c023c22855ea4bcd3f0368296267d60f0d5a91dc91444db2b475c81"} Apr 22 18:39:36.397271 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.397240 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" event={"ID":"3c68dd5e-6d55-44da-b104-a82c798b9b6f","Type":"ContainerStarted","Data":"ec9cb6229d6d4acc72ae6c6b48522979d0ea614f31b9bbe059dc3447a615e3bb"} Apr 22 18:39:36.397271 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.397268 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" event={"ID":"3c68dd5e-6d55-44da-b104-a82c798b9b6f","Type":"ContainerStarted","Data":"84f5435d88e5e9e2991866318b128f5388da207593bfbc5cecdeb3b246d33404"} Apr 22 18:39:36.397416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.397281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" event={"ID":"3c68dd5e-6d55-44da-b104-a82c798b9b6f","Type":"ContainerStarted","Data":"bf277220a2b8dd58b265cf232d8026d4ab6e72e05083a7194f05f230d4216c7f"} Apr 22 18:39:36.401698 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.401654 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-hjrlm" podStartSLOduration=5.5508126 podStartE2EDuration="6.401640662s" podCreationTimestamp="2026-04-22 18:39:30 +0000 UTC" firstStartedPulling="2026-04-22 18:39:33.202277554 +0000 UTC m=+108.840802669" lastFinishedPulling="2026-04-22 18:39:34.053105624 +0000 UTC m=+109.691630731" observedRunningTime="2026-04-22 18:39:36.400954556 +0000 UTC m=+112.039479678" watchObservedRunningTime="2026-04-22 18:39:36.401640662 +0000 UTC m=+112.040165786" Apr 22 18:39:36.426161 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.426107 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tr5s8" podStartSLOduration=4.480778264 podStartE2EDuration="6.426088738s" podCreationTimestamp="2026-04-22 18:39:30 +0000 UTC" firstStartedPulling="2026-04-22 18:39:33.45535685 +0000 UTC m=+109.093881949" lastFinishedPulling="2026-04-22 18:39:35.400667323 +0000 UTC m=+111.039192423" observedRunningTime="2026-04-22 18:39:36.423815729 +0000 UTC m=+112.062340851" watchObservedRunningTime="2026-04-22 18:39:36.426088738 +0000 UTC m=+112.064613862" Apr 22 18:39:36.451209 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.451161 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-qqsrj" podStartSLOduration=4.408573337 podStartE2EDuration="6.45114265s" podCreationTimestamp="2026-04-22 18:39:30 +0000 UTC" firstStartedPulling="2026-04-22 18:39:33.357587886 +0000 UTC m=+108.996112985" lastFinishedPulling="2026-04-22 18:39:35.400157191 +0000 UTC m=+111.038682298" observedRunningTime="2026-04-22 18:39:36.449738262 +0000 UTC m=+112.088263379" watchObservedRunningTime="2026-04-22 18:39:36.45114265 +0000 UTC m=+112.089667774" Apr 22 18:39:36.539610 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.539095 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:36.703281 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:36.703246 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fbf6d666b-dbc8b"] Apr 22 18:39:36.712174 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:36.712142 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc56a4a_7fad_4eb9_b2b8_835b62a5b8d2.slice/crio-dd2b5c1d7d7a9041e45506c1dd1eeeb57b68ca2061d2882b546448a52bd9a00d WatchSource:0}: Error finding container dd2b5c1d7d7a9041e45506c1dd1eeeb57b68ca2061d2882b546448a52bd9a00d: Status 404 returned error can't find the container with id dd2b5c1d7d7a9041e45506c1dd1eeeb57b68ca2061d2882b546448a52bd9a00d Apr 22 18:39:37.400433 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:37.400391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbf6d666b-dbc8b" event={"ID":"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2","Type":"ContainerStarted","Data":"dd2b5c1d7d7a9041e45506c1dd1eeeb57b68ca2061d2882b546448a52bd9a00d"} Apr 22 18:39:38.288483 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:38.288451 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7fd45d456b-kdh4x" Apr 22 18:39:39.410555 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:39.410506 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" event={"ID":"8f90f953-62dc-48c5-ac04-3780fa1d00ba","Type":"ContainerStarted","Data":"bcff4ee52483908ee8636c0b5dada1b62b9744424bad1fd827488c9fe65c1ce5"} Apr 22 18:39:39.413142 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:39.413087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" 
event={"ID":"7e29b4ae-9071-47ff-8090-f8ab3c12bd28","Type":"ContainerStarted","Data":"fabf74fe6cef65ce0ccd93ba18f091931d0ae4be8c2e20fd3f728ac03b1fc4a9"} Apr 22 18:39:39.413142 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:39.413117 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" event={"ID":"7e29b4ae-9071-47ff-8090-f8ab3c12bd28","Type":"ContainerStarted","Data":"824abbc2e66cc590dbd9b3ec76be36d13175d085089ce630340b1158a53a323e"} Apr 22 18:39:39.429089 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:39.428116 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" podStartSLOduration=2.582692333 podStartE2EDuration="5.428099655s" podCreationTimestamp="2026-04-22 18:39:34 +0000 UTC" firstStartedPulling="2026-04-22 18:39:35.561286986 +0000 UTC m=+111.199812091" lastFinishedPulling="2026-04-22 18:39:38.406694301 +0000 UTC m=+114.045219413" observedRunningTime="2026-04-22 18:39:39.426912303 +0000 UTC m=+115.065437426" watchObservedRunningTime="2026-04-22 18:39:39.428099655 +0000 UTC m=+115.066624778" Apr 22 18:39:40.419072 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:40.419023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" event={"ID":"7e29b4ae-9071-47ff-8090-f8ab3c12bd28","Type":"ContainerStarted","Data":"d8b976ab994146c7603526c56cebbbe42e4e8642dfe8bc428303e0e980385671"} Apr 22 18:39:40.443975 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:40.443912 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-767895b6fd-9qkcg" podStartSLOduration=2.874175118 podStartE2EDuration="5.443891661s" podCreationTimestamp="2026-04-22 18:39:35 +0000 UTC" firstStartedPulling="2026-04-22 18:39:35.843913983 +0000 UTC m=+111.482439083" lastFinishedPulling="2026-04-22 18:39:38.413630523 +0000 UTC m=+114.052155626" 
observedRunningTime="2026-04-22 18:39:40.441653351 +0000 UTC m=+116.080178484" watchObservedRunningTime="2026-04-22 18:39:40.443891661 +0000 UTC m=+116.082416783" Apr 22 18:39:41.212107 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:41.212075 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fbf6d666b-dbc8b"] Apr 22 18:39:41.424903 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:41.424877 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerStarted","Data":"13bae3296e1acdd637b1e610cd4c4304d269873170734024e679d8cc6f6c070b"} Apr 22 18:39:41.425260 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:41.424912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerStarted","Data":"a86facaca1707b058b71e76098edc503b51f90400f55b5ebd12e4952c366b1cc"} Apr 22 18:39:42.431691 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:42.431649 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerStarted","Data":"52471e1c36bc584a2e4477787922171df35371a874747e2f4e79300395aaf800"} Apr 22 18:39:42.431691 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:42.431695 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerStarted","Data":"107982d611a0680f906935ce85bb2946f1a24acd61a238763ba29aa68bcbccb5"} Apr 22 18:39:42.432208 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:42.431710 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerStarted","Data":"34b4700af489ca1e69ace037f3195a3ff825319c02eeab983ff22169cb475569"} Apr 22 18:39:42.433918 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:42.433591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbf6d666b-dbc8b" event={"ID":"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2","Type":"ContainerStarted","Data":"acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6"} Apr 22 18:39:42.452105 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:42.452056 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fbf6d666b-dbc8b" podStartSLOduration=1.797793609 podStartE2EDuration="6.452042781s" podCreationTimestamp="2026-04-22 18:39:36 +0000 UTC" firstStartedPulling="2026-04-22 18:39:36.717401156 +0000 UTC m=+112.355926255" lastFinishedPulling="2026-04-22 18:39:41.371650323 +0000 UTC m=+117.010175427" observedRunningTime="2026-04-22 18:39:42.45046715 +0000 UTC m=+118.088992272" watchObservedRunningTime="2026-04-22 18:39:42.452042781 +0000 UTC m=+118.090567903" Apr 22 18:39:43.440736 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:43.440637 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerStarted","Data":"583f07e8c1ed4e68de6ea405a99f7b1107201f763e78a3e4e2ca9bd10f820e7e"} Apr 22 18:39:43.467491 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:43.467414 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.673406331 podStartE2EDuration="12.467397099s" podCreationTimestamp="2026-04-22 18:39:31 +0000 UTC" firstStartedPulling="2026-04-22 18:39:33.38365173 +0000 UTC m=+109.022176829" lastFinishedPulling="2026-04-22 18:39:43.177642495 +0000 UTC m=+118.816167597" observedRunningTime="2026-04-22 18:39:43.466822677 
+0000 UTC m=+119.105347800" watchObservedRunningTime="2026-04-22 18:39:43.467397099 +0000 UTC m=+119.105922223" Apr 22 18:39:46.539666 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:46.539623 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:39:53.630912 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:53.630880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:39:53.633271 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:53.633247 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3bffc40-492a-471a-83d2-c9bd203d82a8-metrics-certs\") pod \"network-metrics-daemon-cqf5t\" (UID: \"e3bffc40-492a-471a-83d2-c9bd203d82a8\") " pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:39:53.905351 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:53.905324 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-stzzq\"" Apr 22 18:39:53.913243 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:53.913219 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqf5t" Apr 22 18:39:54.049459 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:54.049432 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cqf5t"] Apr 22 18:39:54.051897 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:39:54.051873 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3bffc40_492a_471a_83d2_c9bd203d82a8.slice/crio-a73e04484846b6ea28a8c5c7f5d849ab830be64b33a035930dad00a5e920979e WatchSource:0}: Error finding container a73e04484846b6ea28a8c5c7f5d849ab830be64b33a035930dad00a5e920979e: Status 404 returned error can't find the container with id a73e04484846b6ea28a8c5c7f5d849ab830be64b33a035930dad00a5e920979e Apr 22 18:39:54.471616 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:54.471576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cqf5t" event={"ID":"e3bffc40-492a-471a-83d2-c9bd203d82a8","Type":"ContainerStarted","Data":"a73e04484846b6ea28a8c5c7f5d849ab830be64b33a035930dad00a5e920979e"} Apr 22 18:39:54.869330 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:54.869252 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:54.869330 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:54.869297 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:39:56.478584 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:56.478500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cqf5t" event={"ID":"e3bffc40-492a-471a-83d2-c9bd203d82a8","Type":"ContainerStarted","Data":"ba9e31e7c220b08b26afa3d455a3d457c84804ceebf68560ca00c27c918d1420"} Apr 22 18:39:56.478584 ip-10-0-139-10 kubenswrapper[2577]: 
I0422 18:39:56.478533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cqf5t" event={"ID":"e3bffc40-492a-471a-83d2-c9bd203d82a8","Type":"ContainerStarted","Data":"b535bbc5f7173ee08788b9116fd446d8f1509f637221acc196df73b91e14d1c6"} Apr 22 18:39:56.499026 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:39:56.495597 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cqf5t" podStartSLOduration=129.647431781 podStartE2EDuration="2m11.49557957s" podCreationTimestamp="2026-04-22 18:37:45 +0000 UTC" firstStartedPulling="2026-04-22 18:39:54.053862412 +0000 UTC m=+129.692387517" lastFinishedPulling="2026-04-22 18:39:55.902010206 +0000 UTC m=+131.540535306" observedRunningTime="2026-04-22 18:39:56.493470235 +0000 UTC m=+132.131995356" watchObservedRunningTime="2026-04-22 18:39:56.49557957 +0000 UTC m=+132.134104691" Apr 22 18:40:07.456444 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.456357 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6fbf6d666b-dbc8b" podUID="9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" containerName="console" containerID="cri-o://acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6" gracePeriod=15 Apr 22 18:40:07.514503 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.514476 2577 generic.go:358] "Generic (PLEG): container finished" podID="c98406ec-5c79-4a8a-b6e6-80a29ff6a160" containerID="7fd1abc4f6012db6b167f08fbce837180140fb1f699c4788bc2beabd5187ed94" exitCode=0 Apr 22 18:40:07.514599 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.514522 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f" event={"ID":"c98406ec-5c79-4a8a-b6e6-80a29ff6a160","Type":"ContainerDied","Data":"7fd1abc4f6012db6b167f08fbce837180140fb1f699c4788bc2beabd5187ed94"} Apr 22 18:40:07.514831 ip-10-0-139-10 kubenswrapper[2577]: 
I0422 18:40:07.514810 2577 scope.go:117] "RemoveContainer" containerID="7fd1abc4f6012db6b167f08fbce837180140fb1f699c4788bc2beabd5187ed94" Apr 22 18:40:07.720298 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.720277 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fbf6d666b-dbc8b_9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2/console/0.log" Apr 22 18:40:07.720402 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.720345 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:40:07.846477 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846444 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-service-ca\") pod \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " Apr 22 18:40:07.846477 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846478 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-config\") pod \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " Apr 22 18:40:07.846656 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846518 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j2r4\" (UniqueName: \"kubernetes.io/projected/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-kube-api-access-8j2r4\") pod \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " Apr 22 18:40:07.846656 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846623 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-serving-cert\") 
pod \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " Apr 22 18:40:07.846793 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846650 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-trusted-ca-bundle\") pod \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " Apr 22 18:40:07.846793 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846778 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-oauth-config\") pod \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " Apr 22 18:40:07.846894 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846827 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-oauth-serving-cert\") pod \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\" (UID: \"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2\") " Apr 22 18:40:07.846942 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846899 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-config" (OuterVolumeSpecName: "console-config") pod "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" (UID: "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:07.846993 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.846941 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-service-ca" (OuterVolumeSpecName: "service-ca") pod "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" (UID: "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:07.847066 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.847033 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" (UID: "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:07.847192 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.847171 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-service-ca\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:07.847274 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.847197 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:07.847274 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.847207 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-trusted-ca-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:07.847274 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.847260 2577 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" (UID: "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:07.848956 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.848932 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" (UID: "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:07.848956 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.848947 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" (UID: "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:07.849084 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.848932 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-kube-api-access-8j2r4" (OuterVolumeSpecName: "kube-api-access-8j2r4") pod "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" (UID: "9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2"). InnerVolumeSpecName "kube-api-access-8j2r4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:07.947526 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.947498 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8j2r4\" (UniqueName: \"kubernetes.io/projected/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-kube-api-access-8j2r4\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:07.947526 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.947524 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:07.947667 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.947538 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-console-oauth-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:07.947667 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:07.947552 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2-oauth-serving-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.518017 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.517949 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fbf6d666b-dbc8b_9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2/console/0.log" Apr 22 18:40:08.518017 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.517989 2577 generic.go:358] "Generic (PLEG): container finished" podID="9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" containerID="acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6" exitCode=2 Apr 22 18:40:08.518487 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.518058 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fbf6d666b-dbc8b" Apr 22 18:40:08.518487 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.518072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbf6d666b-dbc8b" event={"ID":"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2","Type":"ContainerDied","Data":"acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6"} Apr 22 18:40:08.518487 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.518110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbf6d666b-dbc8b" event={"ID":"9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2","Type":"ContainerDied","Data":"dd2b5c1d7d7a9041e45506c1dd1eeeb57b68ca2061d2882b546448a52bd9a00d"} Apr 22 18:40:08.518487 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.518128 2577 scope.go:117] "RemoveContainer" containerID="acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6" Apr 22 18:40:08.520014 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.519967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2g98f" event={"ID":"c98406ec-5c79-4a8a-b6e6-80a29ff6a160","Type":"ContainerStarted","Data":"cea03416e569a8f0b9a22abf142842c05c3544237c4f0a2f62a74049d95b28fc"} Apr 22 18:40:08.540052 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.540032 2577 scope.go:117] "RemoveContainer" containerID="acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6" Apr 22 18:40:08.540325 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:40:08.540287 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6\": container with ID starting with acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6 not found: ID does not exist" containerID="acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6" Apr 22 
18:40:08.540414 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.540335 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6"} err="failed to get container status \"acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6\": rpc error: code = NotFound desc = could not find container \"acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6\": container with ID starting with acb2baad11e738c609d20c2204a0d983d893493470a3c9ad18c1fd115499e5c6 not found: ID does not exist" Apr 22 18:40:08.557467 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.557443 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fbf6d666b-dbc8b"] Apr 22 18:40:08.563361 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.563336 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6fbf6d666b-dbc8b"] Apr 22 18:40:08.884158 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:08.884081 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" path="/var/lib/kubelet/pods/9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2/volumes" Apr 22 18:40:14.876335 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:14.876301 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:40:14.884645 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:14.884622 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6c6cfdd7fb-nnqg6" Apr 22 18:40:22.561208 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:22.561173 2577 generic.go:358] "Generic (PLEG): container finished" podID="ffeaef2c-e524-4aff-b6b1-3a7e61159f09" containerID="937cce3cd8637edb766c2083acaaf20938f1784a9b5057421bb9b4e478098cce" exitCode=0 Apr 22 18:40:22.561626 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:40:22.561233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gsnxt" event={"ID":"ffeaef2c-e524-4aff-b6b1-3a7e61159f09","Type":"ContainerDied","Data":"937cce3cd8637edb766c2083acaaf20938f1784a9b5057421bb9b4e478098cce"} Apr 22 18:40:22.561626 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:22.561601 2577 scope.go:117] "RemoveContainer" containerID="937cce3cd8637edb766c2083acaaf20938f1784a9b5057421bb9b4e478098cce" Apr 22 18:40:23.565612 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:23.565578 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gsnxt" event={"ID":"ffeaef2c-e524-4aff-b6b1-3a7e61159f09","Type":"ContainerStarted","Data":"b87ed38c724bd8135b136e9176a5e2e1b16385701f63651a446e9ef6d8a80113"} Apr 22 18:40:50.510415 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.510373 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:50.511349 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.511292 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="alertmanager" containerID="cri-o://a86facaca1707b058b71e76098edc503b51f90400f55b5ebd12e4952c366b1cc" gracePeriod=120 Apr 22 18:40:50.511699 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.511667 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy-web" containerID="cri-o://34b4700af489ca1e69ace037f3195a3ff825319c02eeab983ff22169cb475569" gracePeriod=120 Apr 22 18:40:50.511824 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.511668 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy" containerID="cri-o://107982d611a0680f906935ce85bb2946f1a24acd61a238763ba29aa68bcbccb5" gracePeriod=120 Apr 22 18:40:50.512231 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.511856 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="prom-label-proxy" containerID="cri-o://583f07e8c1ed4e68de6ea405a99f7b1107201f763e78a3e4e2ca9bd10f820e7e" gracePeriod=120 Apr 22 18:40:50.512231 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.511864 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="config-reloader" containerID="cri-o://13bae3296e1acdd637b1e610cd4c4304d269873170734024e679d8cc6f6c070b" gracePeriod=120 Apr 22 18:40:50.512231 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.511965 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy-metric" containerID="cri-o://52471e1c36bc584a2e4477787922171df35371a874747e2f4e79300395aaf800" gracePeriod=120 Apr 22 18:40:50.646821 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646795 2577 generic.go:358] "Generic (PLEG): container finished" podID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerID="583f07e8c1ed4e68de6ea405a99f7b1107201f763e78a3e4e2ca9bd10f820e7e" exitCode=0 Apr 22 18:40:50.646821 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646819 2577 generic.go:358] "Generic (PLEG): container finished" podID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerID="52471e1c36bc584a2e4477787922171df35371a874747e2f4e79300395aaf800" exitCode=0 Apr 22 18:40:50.646964 ip-10-0-139-10 kubenswrapper[2577]: 
I0422 18:40:50.646827 2577 generic.go:358] "Generic (PLEG): container finished" podID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerID="107982d611a0680f906935ce85bb2946f1a24acd61a238763ba29aa68bcbccb5" exitCode=0 Apr 22 18:40:50.646964 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646834 2577 generic.go:358] "Generic (PLEG): container finished" podID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerID="13bae3296e1acdd637b1e610cd4c4304d269873170734024e679d8cc6f6c070b" exitCode=0 Apr 22 18:40:50.646964 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646841 2577 generic.go:358] "Generic (PLEG): container finished" podID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerID="a86facaca1707b058b71e76098edc503b51f90400f55b5ebd12e4952c366b1cc" exitCode=0 Apr 22 18:40:50.646964 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerDied","Data":"583f07e8c1ed4e68de6ea405a99f7b1107201f763e78a3e4e2ca9bd10f820e7e"} Apr 22 18:40:50.646964 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646916 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerDied","Data":"52471e1c36bc584a2e4477787922171df35371a874747e2f4e79300395aaf800"} Apr 22 18:40:50.646964 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerDied","Data":"107982d611a0680f906935ce85bb2946f1a24acd61a238763ba29aa68bcbccb5"} Apr 22 18:40:50.646964 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerDied","Data":"13bae3296e1acdd637b1e610cd4c4304d269873170734024e679d8cc6f6c070b"} Apr 22 18:40:50.646964 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:50.646944 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerDied","Data":"a86facaca1707b058b71e76098edc503b51f90400f55b5ebd12e4952c366b1cc"} Apr 22 18:40:51.653604 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.653570 2577 generic.go:358] "Generic (PLEG): container finished" podID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerID="34b4700af489ca1e69ace037f3195a3ff825319c02eeab983ff22169cb475569" exitCode=0 Apr 22 18:40:51.653927 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.653616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerDied","Data":"34b4700af489ca1e69ace037f3195a3ff825319c02eeab983ff22169cb475569"} Apr 22 18:40:51.760941 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.760909 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:51.904457 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904425 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-metrics-client-ca\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.904629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904472 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-config-volume\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.904629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904496 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-main-tls\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.904629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904518 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-config-out\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.904629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904556 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-cluster-tls-config\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.904629 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:40:51.904581 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7nl2\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-kube-api-access-n7nl2\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.904629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904607 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-main-db\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.905035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904641 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-tls-assets\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.905035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904671 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.905035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904704 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-web-config\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.905035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904761 2577 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.905035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904813 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-web\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.905035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904852 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-trusted-ca-bundle\") pod \"9553449b-f89a-41d4-bd95-0169b3d3312b\" (UID: \"9553449b-f89a-41d4-bd95-0169b3d3312b\") " Apr 22 18:40:51.905035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.904918 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:51.905035 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.905007 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). 
InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:40:51.905432 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.905200 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-metrics-client-ca\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:51.905432 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.905222 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-main-db\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:51.907179 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.907146 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-config-volume" (OuterVolumeSpecName: "config-volume") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:51.907687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.907582 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:51.907687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.907612 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-config-out" (OuterVolumeSpecName: "config-out") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:40:51.907687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.907623 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:51.907687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.907651 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:51.908027 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.908001 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:51.908374 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.908352 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:51.908572 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.908545 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-kube-api-access-n7nl2" (OuterVolumeSpecName: "kube-api-access-n7nl2") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "kube-api-access-n7nl2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:51.909882 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.909864 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:51.912163 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.912140 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). 
InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:51.918477 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:51.918454 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-web-config" (OuterVolumeSpecName: "web-config") pod "9553449b-f89a-41d4-bd95-0169b3d3312b" (UID: "9553449b-f89a-41d4-bd95-0169b3d3312b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:52.006187 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006159 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-config-volume\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006187 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006185 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-main-tls\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006195 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9553449b-f89a-41d4-bd95-0169b3d3312b-config-out\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006204 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-cluster-tls-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006213 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7nl2\" (UniqueName: 
\"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-kube-api-access-n7nl2\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006222 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9553449b-f89a-41d4-bd95-0169b3d3312b-tls-assets\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006231 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006241 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-web-config\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006250 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006259 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9553449b-f89a-41d4-bd95-0169b3d3312b-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.006304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.006270 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9553449b-f89a-41d4-bd95-0169b3d3312b-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:40:52.659302 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.659267 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9553449b-f89a-41d4-bd95-0169b3d3312b","Type":"ContainerDied","Data":"9994610ab20fe1d117cc3d16364bb31d4cf565b00b35358af1e2b852f35d7aa5"} Apr 22 18:40:52.659680 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.659318 2577 scope.go:117] "RemoveContainer" containerID="583f07e8c1ed4e68de6ea405a99f7b1107201f763e78a3e4e2ca9bd10f820e7e" Apr 22 18:40:52.659680 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.659353 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.666514 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.666495 2577 scope.go:117] "RemoveContainer" containerID="52471e1c36bc584a2e4477787922171df35371a874747e2f4e79300395aaf800" Apr 22 18:40:52.673555 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.673538 2577 scope.go:117] "RemoveContainer" containerID="107982d611a0680f906935ce85bb2946f1a24acd61a238763ba29aa68bcbccb5" Apr 22 18:40:52.679749 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.679733 2577 scope.go:117] "RemoveContainer" containerID="34b4700af489ca1e69ace037f3195a3ff825319c02eeab983ff22169cb475569" Apr 22 18:40:52.685064 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.685041 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:52.686507 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.686491 2577 scope.go:117] "RemoveContainer" containerID="13bae3296e1acdd637b1e610cd4c4304d269873170734024e679d8cc6f6c070b" Apr 22 18:40:52.692954 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.692931 2577 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:52.693595 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.693578 2577 scope.go:117] "RemoveContainer" containerID="a86facaca1707b058b71e76098edc503b51f90400f55b5ebd12e4952c366b1cc" Apr 22 18:40:52.699759 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.699741 2577 scope.go:117] "RemoveContainer" containerID="b44224d77c023c22855ea4bcd3f0368296267d60f0d5a91dc91444db2b475c81" Apr 22 18:40:52.723253 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723234 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:52.723497 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723485 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" containerName="console" Apr 22 18:40:52.723544 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723498 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" containerName="console" Apr 22 18:40:52.723544 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723506 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="init-config-reloader" Apr 22 18:40:52.723544 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723513 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="init-config-reloader" Apr 22 18:40:52.723544 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723521 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="config-reloader" Apr 22 18:40:52.723544 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723527 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="config-reloader" Apr 22 18:40:52.723544 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723536 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy-web" Apr 22 18:40:52.723544 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723542 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy-web" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723552 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy-metric" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723557 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy-metric" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723563 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="prom-label-proxy" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723569 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="prom-label-proxy" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723580 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723585 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723595 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" 
containerName="alertmanager" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723600 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="alertmanager" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723645 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723654 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy-metric" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723660 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="alertmanager" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723668 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="config-reloader" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723674 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="prom-label-proxy" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723681 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" containerName="kube-rbac-proxy-web" Apr 22 18:40:52.723767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.723688 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9dc56a4a-7fad-4eb9-b2b8-835b62a5b8d2" containerName="console" Apr 22 18:40:52.728560 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.728544 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.731041 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731023 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:40:52.731269 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731254 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:40:52.731576 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731562 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:40:52.731576 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731570 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hgh2v\"" Apr 22 18:40:52.731725 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:40:52.731880 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731864 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:40:52.731969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731952 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:40:52.732029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731957 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:40:52.732029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.731995 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:40:52.737359 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.737342 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:40:52.740244 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.740222 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:52.813379 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76660775-fc5f-4d9a-8159-93b4eb7a52d3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813489 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4s8x\" (UniqueName: \"kubernetes.io/projected/76660775-fc5f-4d9a-8159-93b4eb7a52d3-kube-api-access-k4s8x\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813489 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813489 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/76660775-fc5f-4d9a-8159-93b4eb7a52d3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813587 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76660775-fc5f-4d9a-8159-93b4eb7a52d3-config-out\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813587 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-web-config\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813650 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813650 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/76660775-fc5f-4d9a-8159-93b4eb7a52d3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813734 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:40:52.813656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813766 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813766 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76660775-fc5f-4d9a-8159-93b4eb7a52d3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813831 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-config-volume\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.813831 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.813799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.883212 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.883150 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9553449b-f89a-41d4-bd95-0169b3d3312b" path="/var/lib/kubelet/pods/9553449b-f89a-41d4-bd95-0169b3d3312b/volumes" Apr 22 18:40:52.914797 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-web-config\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.914882 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.914882 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/76660775-fc5f-4d9a-8159-93b4eb7a52d3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.914882 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.914992 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.914992 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76660775-fc5f-4d9a-8159-93b4eb7a52d3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.914992 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-config-volume\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.914992 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.914992 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.914977 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76660775-fc5f-4d9a-8159-93b4eb7a52d3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.915240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.915192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/76660775-fc5f-4d9a-8159-93b4eb7a52d3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.915618 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.915600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4s8x\" (UniqueName: \"kubernetes.io/projected/76660775-fc5f-4d9a-8159-93b4eb7a52d3-kube-api-access-k4s8x\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.915803 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.915781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.915914 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.915899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76660775-fc5f-4d9a-8159-93b4eb7a52d3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.916041 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:40:52.916025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76660775-fc5f-4d9a-8159-93b4eb7a52d3-config-out\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.916344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.915647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76660775-fc5f-4d9a-8159-93b4eb7a52d3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.917047 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.917019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76660775-fc5f-4d9a-8159-93b4eb7a52d3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.918018 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.917971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.918643 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.918280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.918643 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.918389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-config-volume\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.918643 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.918584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.918643 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.918604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/76660775-fc5f-4d9a-8159-93b4eb7a52d3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.918643 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.918640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/76660775-fc5f-4d9a-8159-93b4eb7a52d3-config-out\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.918866 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.918786 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-web-config\") pod \"alertmanager-main-0\" (UID: 
\"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.919213 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.919194 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.919742 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.919701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/76660775-fc5f-4d9a-8159-93b4eb7a52d3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:52.924692 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:52.924676 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4s8x\" (UniqueName: \"kubernetes.io/projected/76660775-fc5f-4d9a-8159-93b4eb7a52d3-kube-api-access-k4s8x\") pod \"alertmanager-main-0\" (UID: \"76660775-fc5f-4d9a-8159-93b4eb7a52d3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:53.037914 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:53.037892 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:53.164601 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:53.164524 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:53.166792 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:40:53.166767 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76660775_fc5f_4d9a_8159_93b4eb7a52d3.slice/crio-838d5df49487e045fdd2474dc4d4f8ade469b8020e9b84b64484f33c72767539 WatchSource:0}: Error finding container 838d5df49487e045fdd2474dc4d4f8ade469b8020e9b84b64484f33c72767539: Status 404 returned error can't find the container with id 838d5df49487e045fdd2474dc4d4f8ade469b8020e9b84b64484f33c72767539 Apr 22 18:40:53.664027 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:53.663989 2577 generic.go:358] "Generic (PLEG): container finished" podID="76660775-fc5f-4d9a-8159-93b4eb7a52d3" containerID="fb7161a55b6d77b2b3e095c9256cefa26d757607838c95044a67d7742627051f" exitCode=0 Apr 22 18:40:53.664349 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:53.664036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"76660775-fc5f-4d9a-8159-93b4eb7a52d3","Type":"ContainerDied","Data":"fb7161a55b6d77b2b3e095c9256cefa26d757607838c95044a67d7742627051f"} Apr 22 18:40:53.664349 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:53.664059 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"76660775-fc5f-4d9a-8159-93b4eb7a52d3","Type":"ContainerStarted","Data":"838d5df49487e045fdd2474dc4d4f8ade469b8020e9b84b64484f33c72767539"} Apr 22 18:40:54.670175 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:54.670142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"76660775-fc5f-4d9a-8159-93b4eb7a52d3","Type":"ContainerStarted","Data":"ae68ea22563fda37f152d3f9b99afb82f87fe928a705895baebcf860b694f2a5"} Apr 22 18:40:54.670175 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:54.670176 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"76660775-fc5f-4d9a-8159-93b4eb7a52d3","Type":"ContainerStarted","Data":"7f5dc7218dbac1ba750e7bc435314562f72c3e6cc1a7fd243359a0e176a7c50e"} Apr 22 18:40:54.670575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:54.670185 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"76660775-fc5f-4d9a-8159-93b4eb7a52d3","Type":"ContainerStarted","Data":"21ea6bf63f151edb567983458db46b5ba7ed323e38f202eb4f0b922a69d5e496"} Apr 22 18:40:54.670575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:54.670195 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"76660775-fc5f-4d9a-8159-93b4eb7a52d3","Type":"ContainerStarted","Data":"f9615ec9cf5e373f5f876ae4db9418388cc06126da9d1d4e1d2fe389da436bd6"} Apr 22 18:40:54.670575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:54.670204 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"76660775-fc5f-4d9a-8159-93b4eb7a52d3","Type":"ContainerStarted","Data":"43ee4d2fb5dd67d8e6b1c1d2249202be33ca7adc3827e480d0a95abe483d9203"} Apr 22 18:40:54.670575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:54.670212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"76660775-fc5f-4d9a-8159-93b4eb7a52d3","Type":"ContainerStarted","Data":"789e4b1bcda85f732efb8d946382a632769ad037c5d952877fe68e32830da2c9"} Apr 22 18:40:54.698502 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:40:54.698439 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.698422114 podStartE2EDuration="2.698422114s" podCreationTimestamp="2026-04-22 18:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:40:54.697798664 +0000 UTC m=+190.336323786" watchObservedRunningTime="2026-04-22 18:40:54.698422114 +0000 UTC m=+190.336947236" Apr 22 18:41:55.556261 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.555978 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q"] Apr 22 18:41:55.561023 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.561003 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.563472 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.563454 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:41:55.563538 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.563455 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:41:55.563538 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.563490 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-94mmb\"" Apr 22 18:41:55.568675 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.568650 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q"] Apr 22 18:41:55.718301 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.718262 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.718472 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.718314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.718472 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.718362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrchf\" (UniqueName: \"kubernetes.io/projected/280a70e8-8a0d-4ece-9118-616be30bbebf-kube-api-access-rrchf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.819418 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.819338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.819418 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.819384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.819418 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.819421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrchf\" (UniqueName: \"kubernetes.io/projected/280a70e8-8a0d-4ece-9118-616be30bbebf-kube-api-access-rrchf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.819708 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.819689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.819871 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.819855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.828313 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.828293 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrchf\" (UniqueName: \"kubernetes.io/projected/280a70e8-8a0d-4ece-9118-616be30bbebf-kube-api-access-rrchf\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.871853 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.871829 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:41:55.991934 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:55.991896 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q"] Apr 22 18:41:55.994815 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:41:55.994787 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280a70e8_8a0d_4ece_9118_616be30bbebf.slice/crio-6986b8aa9e672582792f7b297dfe9e3f4c1f1c95071e07edf4eb236acc4021f9 WatchSource:0}: Error finding container 6986b8aa9e672582792f7b297dfe9e3f4c1f1c95071e07edf4eb236acc4021f9: Status 404 returned error can't find the container with id 6986b8aa9e672582792f7b297dfe9e3f4c1f1c95071e07edf4eb236acc4021f9 Apr 22 18:41:56.848705 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:41:56.848662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" event={"ID":"280a70e8-8a0d-4ece-9118-616be30bbebf","Type":"ContainerStarted","Data":"6986b8aa9e672582792f7b297dfe9e3f4c1f1c95071e07edf4eb236acc4021f9"} Apr 22 18:42:00.863414 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:00.863384 2577 generic.go:358] "Generic (PLEG): container finished" podID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerID="eb20089add91178ec8c60c687fda541709f070fcbf914ee942031b4f8426af3b" exitCode=0 Apr 22 18:42:00.863739 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:00.863453 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" event={"ID":"280a70e8-8a0d-4ece-9118-616be30bbebf","Type":"ContainerDied","Data":"eb20089add91178ec8c60c687fda541709f070fcbf914ee942031b4f8426af3b"} Apr 22 18:42:03.874013 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:03.873979 2577 generic.go:358] "Generic (PLEG): container finished" podID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerID="012fa5f6ad103a24f25834841b49eb6799c013a5ce0e28721b10dfbecdfae680" exitCode=0 Apr 22 18:42:03.874412 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:03.874061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" event={"ID":"280a70e8-8a0d-4ece-9118-616be30bbebf","Type":"ContainerDied","Data":"012fa5f6ad103a24f25834841b49eb6799c013a5ce0e28721b10dfbecdfae680"} Apr 22 18:42:09.894596 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:09.894552 2577 generic.go:358] "Generic (PLEG): container finished" podID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerID="b3bbf55c2ddc6450642f57ce5fa0ab862037264db6ba6ed03d2c0fcda3493d02" exitCode=0 Apr 22 18:42:09.894934 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:09.894644 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" event={"ID":"280a70e8-8a0d-4ece-9118-616be30bbebf","Type":"ContainerDied","Data":"b3bbf55c2ddc6450642f57ce5fa0ab862037264db6ba6ed03d2c0fcda3493d02"} Apr 22 18:42:11.012684 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.012663 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:42:11.051174 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.051147 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrchf\" (UniqueName: \"kubernetes.io/projected/280a70e8-8a0d-4ece-9118-616be30bbebf-kube-api-access-rrchf\") pod \"280a70e8-8a0d-4ece-9118-616be30bbebf\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " Apr 22 18:42:11.051305 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.051229 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-bundle\") pod \"280a70e8-8a0d-4ece-9118-616be30bbebf\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " Apr 22 18:42:11.051368 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.051305 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-util\") pod \"280a70e8-8a0d-4ece-9118-616be30bbebf\" (UID: \"280a70e8-8a0d-4ece-9118-616be30bbebf\") " Apr 22 18:42:11.051909 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.051882 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-bundle" (OuterVolumeSpecName: "bundle") pod "280a70e8-8a0d-4ece-9118-616be30bbebf" (UID: "280a70e8-8a0d-4ece-9118-616be30bbebf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:42:11.053367 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.053342 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280a70e8-8a0d-4ece-9118-616be30bbebf-kube-api-access-rrchf" (OuterVolumeSpecName: "kube-api-access-rrchf") pod "280a70e8-8a0d-4ece-9118-616be30bbebf" (UID: "280a70e8-8a0d-4ece-9118-616be30bbebf"). InnerVolumeSpecName "kube-api-access-rrchf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:42:11.056255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.056237 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-util" (OuterVolumeSpecName: "util") pod "280a70e8-8a0d-4ece-9118-616be30bbebf" (UID: "280a70e8-8a0d-4ece-9118-616be30bbebf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:42:11.152459 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.152386 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:42:11.152459 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.152413 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rrchf\" (UniqueName: \"kubernetes.io/projected/280a70e8-8a0d-4ece-9118-616be30bbebf-kube-api-access-rrchf\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:42:11.152459 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.152424 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/280a70e8-8a0d-4ece-9118-616be30bbebf-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:42:11.901542 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.901509 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" event={"ID":"280a70e8-8a0d-4ece-9118-616be30bbebf","Type":"ContainerDied","Data":"6986b8aa9e672582792f7b297dfe9e3f4c1f1c95071e07edf4eb236acc4021f9"} Apr 22 18:42:11.901542 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.901545 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6986b8aa9e672582792f7b297dfe9e3f4c1f1c95071e07edf4eb236acc4021f9" Apr 22 18:42:11.901800 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:11.901522 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c79j7q" Apr 22 18:42:22.222986 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.222953 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jgbk7"] Apr 22 18:42:22.223416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.223252 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerName="extract" Apr 22 18:42:22.223416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.223263 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerName="extract" Apr 22 18:42:22.223416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.223274 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerName="pull" Apr 22 18:42:22.223416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.223280 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerName="pull" Apr 22 18:42:22.223416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.223287 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="280a70e8-8a0d-4ece-9118-616be30bbebf" 
containerName="util" Apr 22 18:42:22.223416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.223293 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerName="util" Apr 22 18:42:22.223416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.223360 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="280a70e8-8a0d-4ece-9118-616be30bbebf" containerName="extract" Apr 22 18:42:22.272752 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.272684 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jgbk7"] Apr 22 18:42:22.272885 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.272845 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.276017 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.275988 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-w85zh\"" Apr 22 18:42:22.276257 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.276224 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 18:42:22.278821 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.278790 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:42:22.278821 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.278805 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:42:22.279021 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.278858 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:42:22.279021 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.278961 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:42:22.333956 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.333927 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.334070 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.333967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-cabundle0\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.334070 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.334040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5kl\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-kube-api-access-mv5kl\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.434865 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.434830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.435007 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.434879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-cabundle0\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.435007 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.434930 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5kl\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-kube-api-access-mv5kl\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.435007 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:22.434970 2577 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:42:22.435007 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:22.434988 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:42:22.435007 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:22.434997 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jgbk7: references non-existent secret key: ca.crt Apr 22 18:42:22.435170 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:22.435079 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates podName:6f3e223e-6c59-4d31-a1b7-d253bd5bd996 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:22.935058581 +0000 UTC m=+278.573583681 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates") pod "keda-operator-ffbb595cb-jgbk7" (UID: "6f3e223e-6c59-4d31-a1b7-d253bd5bd996") : references non-existent secret key: ca.crt Apr 22 18:42:22.435512 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.435492 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-cabundle0\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.446085 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.446063 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5kl\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-kube-api-access-mv5kl\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.773894 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.773862 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-4qcj7"] Apr 22 18:42:22.803920 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.803895 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-4qcj7"] Apr 22 18:42:22.804055 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.803994 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:22.806421 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.806400 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 18:42:22.839136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.839112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99f5d991-bfdb-4bce-a8c8-64c9063e3214-certificates\") pod \"keda-admission-cf49989db-4qcj7\" (UID: \"99f5d991-bfdb-4bce-a8c8-64c9063e3214\") " pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:22.839262 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.839152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clrx4\" (UniqueName: \"kubernetes.io/projected/99f5d991-bfdb-4bce-a8c8-64c9063e3214-kube-api-access-clrx4\") pod \"keda-admission-cf49989db-4qcj7\" (UID: \"99f5d991-bfdb-4bce-a8c8-64c9063e3214\") " pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:22.939578 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.939552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:22.939758 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.939583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99f5d991-bfdb-4bce-a8c8-64c9063e3214-certificates\") pod \"keda-admission-cf49989db-4qcj7\" (UID: \"99f5d991-bfdb-4bce-a8c8-64c9063e3214\") " pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 
18:42:22.939758 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.939605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clrx4\" (UniqueName: \"kubernetes.io/projected/99f5d991-bfdb-4bce-a8c8-64c9063e3214-kube-api-access-clrx4\") pod \"keda-admission-cf49989db-4qcj7\" (UID: \"99f5d991-bfdb-4bce-a8c8-64c9063e3214\") " pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:22.939758 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:22.939696 2577 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:42:22.939758 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:22.939733 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:42:22.939758 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:22.939743 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jgbk7: references non-existent secret key: ca.crt Apr 22 18:42:22.940021 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:22.939807 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates podName:6f3e223e-6c59-4d31-a1b7-d253bd5bd996 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:23.939788231 +0000 UTC m=+279.578313333 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates") pod "keda-operator-ffbb595cb-jgbk7" (UID: "6f3e223e-6c59-4d31-a1b7-d253bd5bd996") : references non-existent secret key: ca.crt Apr 22 18:42:22.941989 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.941968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99f5d991-bfdb-4bce-a8c8-64c9063e3214-certificates\") pod \"keda-admission-cf49989db-4qcj7\" (UID: \"99f5d991-bfdb-4bce-a8c8-64c9063e3214\") " pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:22.951109 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:22.951088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clrx4\" (UniqueName: \"kubernetes.io/projected/99f5d991-bfdb-4bce-a8c8-64c9063e3214-kube-api-access-clrx4\") pod \"keda-admission-cf49989db-4qcj7\" (UID: \"99f5d991-bfdb-4bce-a8c8-64c9063e3214\") " pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:23.114673 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:23.114606 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:23.245074 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:23.245011 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-4qcj7"] Apr 22 18:42:23.247642 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:42:23.247615 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99f5d991_bfdb_4bce_a8c8_64c9063e3214.slice/crio-826cb0c8cafef722936d751e0ef5571b07fb4e57849946f07c3259854d6f0d8b WatchSource:0}: Error finding container 826cb0c8cafef722936d751e0ef5571b07fb4e57849946f07c3259854d6f0d8b: Status 404 returned error can't find the container with id 826cb0c8cafef722936d751e0ef5571b07fb4e57849946f07c3259854d6f0d8b Apr 22 18:42:23.935747 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:23.935701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-4qcj7" event={"ID":"99f5d991-bfdb-4bce-a8c8-64c9063e3214","Type":"ContainerStarted","Data":"826cb0c8cafef722936d751e0ef5571b07fb4e57849946f07c3259854d6f0d8b"} Apr 22 18:42:23.949073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:23.949044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:23.949180 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:23.949172 2577 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:42:23.949216 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:23.949184 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:42:23.949216 ip-10-0-139-10 kubenswrapper[2577]: E0422 
18:42:23.949193 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jgbk7: references non-existent secret key: ca.crt Apr 22 18:42:23.949281 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:23.949249 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates podName:6f3e223e-6c59-4d31-a1b7-d253bd5bd996 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:25.949232068 +0000 UTC m=+281.587757170 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates") pod "keda-operator-ffbb595cb-jgbk7" (UID: "6f3e223e-6c59-4d31-a1b7-d253bd5bd996") : references non-existent secret key: ca.crt Apr 22 18:42:25.967876 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:25.967821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:25.968400 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:25.967992 2577 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:42:25.968400 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:25.968015 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:42:25.968400 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:25.968029 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jgbk7: references non-existent secret key: ca.crt Apr 22 18:42:25.968400 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:42:25.968095 2577 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates podName:6f3e223e-6c59-4d31-a1b7-d253bd5bd996 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:29.968076224 +0000 UTC m=+285.606601340 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates") pod "keda-operator-ffbb595cb-jgbk7" (UID: "6f3e223e-6c59-4d31-a1b7-d253bd5bd996") : references non-existent secret key: ca.crt Apr 22 18:42:26.945494 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:26.945460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-4qcj7" event={"ID":"99f5d991-bfdb-4bce-a8c8-64c9063e3214","Type":"ContainerStarted","Data":"b91a8dec53436d17010d57072a01b7f81c1feaf5f1b9a5f0cc33e685f3ffdf58"} Apr 22 18:42:26.945657 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:26.945578 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:26.962298 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:26.962254 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-4qcj7" podStartSLOduration=2.169911441 podStartE2EDuration="4.962240727s" podCreationTimestamp="2026-04-22 18:42:22 +0000 UTC" firstStartedPulling="2026-04-22 18:42:23.248895239 +0000 UTC m=+278.887420338" lastFinishedPulling="2026-04-22 18:42:26.041224522 +0000 UTC m=+281.679749624" observedRunningTime="2026-04-22 18:42:26.961402914 +0000 UTC m=+282.599928036" watchObservedRunningTime="2026-04-22 18:42:26.962240727 +0000 UTC m=+282.600765849" Apr 22 18:42:29.998392 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:29.998359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:30.000664 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:30.000645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f3e223e-6c59-4d31-a1b7-d253bd5bd996-certificates\") pod \"keda-operator-ffbb595cb-jgbk7\" (UID: \"6f3e223e-6c59-4d31-a1b7-d253bd5bd996\") " pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:30.086819 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:30.086791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:30.206695 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:30.206674 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jgbk7"] Apr 22 18:42:30.208894 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:42:30.208866 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3e223e_6c59_4d31_a1b7_d253bd5bd996.slice/crio-17cf6ec292041645ac112664180e5eae2c5c9fbdac1408203b7482fd7dcb1b6d WatchSource:0}: Error finding container 17cf6ec292041645ac112664180e5eae2c5c9fbdac1408203b7482fd7dcb1b6d: Status 404 returned error can't find the container with id 17cf6ec292041645ac112664180e5eae2c5c9fbdac1408203b7482fd7dcb1b6d Apr 22 18:42:30.957442 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:30.957405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" event={"ID":"6f3e223e-6c59-4d31-a1b7-d253bd5bd996","Type":"ContainerStarted","Data":"17cf6ec292041645ac112664180e5eae2c5c9fbdac1408203b7482fd7dcb1b6d"} Apr 22 18:42:36.975548 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:36.975511 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" event={"ID":"6f3e223e-6c59-4d31-a1b7-d253bd5bd996","Type":"ContainerStarted","Data":"99b7627cc77b64c19d5f0f031cea0c290b9fdc9987d7d072285eccb44fbecdab"} Apr 22 18:42:36.976002 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:36.975599 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:42:36.993275 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:36.993230 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" podStartSLOduration=8.454810598 podStartE2EDuration="14.993216117s" podCreationTimestamp="2026-04-22 18:42:22 +0000 UTC" firstStartedPulling="2026-04-22 18:42:30.210653116 +0000 UTC m=+285.849178217" lastFinishedPulling="2026-04-22 18:42:36.749058631 +0000 UTC m=+292.387583736" observedRunningTime="2026-04-22 18:42:36.991130203 +0000 UTC m=+292.629655325" watchObservedRunningTime="2026-04-22 18:42:36.993216117 +0000 UTC m=+292.631741239" Apr 22 18:42:44.818427 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:44.818401 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:42:47.950380 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:47.950355 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-4qcj7" Apr 22 18:42:57.980334 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:42:57.980300 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-jgbk7" Apr 22 18:43:16.206469 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.206438 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr"] Apr 22 18:43:16.213605 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.213588 2577 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.216372 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.216330 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-94mmb\"" Apr 22 18:43:16.216547 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.216410 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:43:16.217081 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.217035 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:43:16.219404 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.219379 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr"] Apr 22 18:43:16.261570 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.261545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6pt\" (UniqueName: \"kubernetes.io/projected/3d629c8d-ebc5-4c3d-8173-c33d27151799-kube-api-access-cz6pt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.261678 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.261582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.261678 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.261642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.362709 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.362674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6pt\" (UniqueName: \"kubernetes.io/projected/3d629c8d-ebc5-4c3d-8173-c33d27151799-kube-api-access-cz6pt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.362873 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.362734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.362873 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.362773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.363147 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.363132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.363681 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.363662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.371503 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.371482 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6pt\" (UniqueName: \"kubernetes.io/projected/3d629c8d-ebc5-4c3d-8173-c33d27151799-kube-api-access-cz6pt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.523655 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.523571 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:16.640506 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.640482 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr"] Apr 22 18:43:16.642956 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:43:16.642927 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d629c8d_ebc5_4c3d_8173_c33d27151799.slice/crio-b6d168cbcba1bd92b0bc97c12ae51df7b4462996575b4678ebd05e601f0ffaff WatchSource:0}: Error finding container b6d168cbcba1bd92b0bc97c12ae51df7b4462996575b4678ebd05e601f0ffaff: Status 404 returned error can't find the container with id b6d168cbcba1bd92b0bc97c12ae51df7b4462996575b4678ebd05e601f0ffaff Apr 22 18:43:16.644887 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:16.644866 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:43:17.090953 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:17.090921 2577 generic.go:358] "Generic (PLEG): container finished" podID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerID="664468ed643ac3b6edd99936b55875c88d4eed476b7cd0bee3c5ad7f0b0b92a4" exitCode=0 Apr 22 18:43:17.091139 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:17.090967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" event={"ID":"3d629c8d-ebc5-4c3d-8173-c33d27151799","Type":"ContainerDied","Data":"664468ed643ac3b6edd99936b55875c88d4eed476b7cd0bee3c5ad7f0b0b92a4"} Apr 22 18:43:17.091139 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:17.090988 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" 
event={"ID":"3d629c8d-ebc5-4c3d-8173-c33d27151799","Type":"ContainerStarted","Data":"b6d168cbcba1bd92b0bc97c12ae51df7b4462996575b4678ebd05e601f0ffaff"} Apr 22 18:43:20.103678 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:20.103646 2577 generic.go:358] "Generic (PLEG): container finished" podID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerID="fc4c773938a050cda51d15b3d380578435b1661470d9656be6f1263ba1c103f4" exitCode=0 Apr 22 18:43:20.104142 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:20.103711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" event={"ID":"3d629c8d-ebc5-4c3d-8173-c33d27151799","Type":"ContainerDied","Data":"fc4c773938a050cda51d15b3d380578435b1661470d9656be6f1263ba1c103f4"} Apr 22 18:43:21.108533 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:21.108489 2577 generic.go:358] "Generic (PLEG): container finished" podID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerID="b19c9368f35ddf01e29a311fca61ceea13a1864261209ec8f3ca6341c2c3d930" exitCode=0 Apr 22 18:43:21.108916 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:21.108560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" event={"ID":"3d629c8d-ebc5-4c3d-8173-c33d27151799","Type":"ContainerDied","Data":"b19c9368f35ddf01e29a311fca61ceea13a1864261209ec8f3ca6341c2c3d930"} Apr 22 18:43:22.233686 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.233664 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:22.308544 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.308517 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-bundle\") pod \"3d629c8d-ebc5-4c3d-8173-c33d27151799\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " Apr 22 18:43:22.308691 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.308556 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-util\") pod \"3d629c8d-ebc5-4c3d-8173-c33d27151799\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " Apr 22 18:43:22.308691 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.308608 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz6pt\" (UniqueName: \"kubernetes.io/projected/3d629c8d-ebc5-4c3d-8173-c33d27151799-kube-api-access-cz6pt\") pod \"3d629c8d-ebc5-4c3d-8173-c33d27151799\" (UID: \"3d629c8d-ebc5-4c3d-8173-c33d27151799\") " Apr 22 18:43:22.309205 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.309178 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-bundle" (OuterVolumeSpecName: "bundle") pod "3d629c8d-ebc5-4c3d-8173-c33d27151799" (UID: "3d629c8d-ebc5-4c3d-8173-c33d27151799"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:43:22.310834 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.310804 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d629c8d-ebc5-4c3d-8173-c33d27151799-kube-api-access-cz6pt" (OuterVolumeSpecName: "kube-api-access-cz6pt") pod "3d629c8d-ebc5-4c3d-8173-c33d27151799" (UID: "3d629c8d-ebc5-4c3d-8173-c33d27151799"). InnerVolumeSpecName "kube-api-access-cz6pt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:43:22.313248 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.313218 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-util" (OuterVolumeSpecName: "util") pod "3d629c8d-ebc5-4c3d-8173-c33d27151799" (UID: "3d629c8d-ebc5-4c3d-8173-c33d27151799"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:43:22.409403 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.409375 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:43:22.409403 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.409400 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d629c8d-ebc5-4c3d-8173-c33d27151799-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:43:22.409554 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:22.409414 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cz6pt\" (UniqueName: \"kubernetes.io/projected/3d629c8d-ebc5-4c3d-8173-c33d27151799-kube-api-access-cz6pt\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:43:23.115134 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:23.115102 2577 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" Apr 22 18:43:23.115295 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:23.115096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dw6gmr" event={"ID":"3d629c8d-ebc5-4c3d-8173-c33d27151799","Type":"ContainerDied","Data":"b6d168cbcba1bd92b0bc97c12ae51df7b4462996575b4678ebd05e601f0ffaff"} Apr 22 18:43:23.115295 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:23.115221 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6d168cbcba1bd92b0bc97c12ae51df7b4462996575b4678ebd05e601f0ffaff" Apr 22 18:43:29.091886 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.091798 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6"] Apr 22 18:43:29.092341 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.092282 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerName="extract" Apr 22 18:43:29.092341 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.092301 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerName="extract" Apr 22 18:43:29.092341 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.092337 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerName="util" Apr 22 18:43:29.092493 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.092346 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerName="util" Apr 22 18:43:29.092493 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.092359 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerName="pull" Apr 22 18:43:29.092493 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.092369 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerName="pull" Apr 22 18:43:29.092493 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.092447 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d629c8d-ebc5-4c3d-8173-c33d27151799" containerName="extract" Apr 22 18:43:29.095319 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.095297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" Apr 22 18:43:29.098074 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.098047 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 18:43:29.098193 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.098169 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-rdd9k\"" Apr 22 18:43:29.098638 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.098621 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:43:29.107449 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.107428 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6"] Apr 22 18:43:29.164125 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.164097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8a07f6e-bb60-4d49-97b4-312b53aaaef3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-f5mr6\" (UID: 
\"a8a07f6e-bb60-4d49-97b4-312b53aaaef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" Apr 22 18:43:29.164244 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.164149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8gts\" (UniqueName: \"kubernetes.io/projected/a8a07f6e-bb60-4d49-97b4-312b53aaaef3-kube-api-access-l8gts\") pod \"cert-manager-operator-controller-manager-54b9655956-f5mr6\" (UID: \"a8a07f6e-bb60-4d49-97b4-312b53aaaef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" Apr 22 18:43:29.265190 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.265163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8gts\" (UniqueName: \"kubernetes.io/projected/a8a07f6e-bb60-4d49-97b4-312b53aaaef3-kube-api-access-l8gts\") pod \"cert-manager-operator-controller-manager-54b9655956-f5mr6\" (UID: \"a8a07f6e-bb60-4d49-97b4-312b53aaaef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" Apr 22 18:43:29.265300 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.265226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8a07f6e-bb60-4d49-97b4-312b53aaaef3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-f5mr6\" (UID: \"a8a07f6e-bb60-4d49-97b4-312b53aaaef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" Apr 22 18:43:29.265545 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.265530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8a07f6e-bb60-4d49-97b4-312b53aaaef3-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-f5mr6\" (UID: \"a8a07f6e-bb60-4d49-97b4-312b53aaaef3\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" Apr 22 18:43:29.277343 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.277321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8gts\" (UniqueName: \"kubernetes.io/projected/a8a07f6e-bb60-4d49-97b4-312b53aaaef3-kube-api-access-l8gts\") pod \"cert-manager-operator-controller-manager-54b9655956-f5mr6\" (UID: \"a8a07f6e-bb60-4d49-97b4-312b53aaaef3\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" Apr 22 18:43:29.404765 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.404743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" Apr 22 18:43:29.547143 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:29.547120 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6"] Apr 22 18:43:29.549866 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:43:29.549833 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a07f6e_bb60_4d49_97b4_312b53aaaef3.slice/crio-495bc9cec68aba1d0c47ed9c559390af0337bfee29c4d2e8737984c22674ab61 WatchSource:0}: Error finding container 495bc9cec68aba1d0c47ed9c559390af0337bfee29c4d2e8737984c22674ab61: Status 404 returned error can't find the container with id 495bc9cec68aba1d0c47ed9c559390af0337bfee29c4d2e8737984c22674ab61 Apr 22 18:43:30.137994 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:30.137962 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" event={"ID":"a8a07f6e-bb60-4d49-97b4-312b53aaaef3","Type":"ContainerStarted","Data":"495bc9cec68aba1d0c47ed9c559390af0337bfee29c4d2e8737984c22674ab61"} Apr 22 18:43:33.149181 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:33.149146 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" event={"ID":"a8a07f6e-bb60-4d49-97b4-312b53aaaef3","Type":"ContainerStarted","Data":"452480d348803cb61f6478adedaaacb2c37dc6d9c3e702e41d96fb21c76d2f90"} Apr 22 18:43:33.177508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:33.177394 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-f5mr6" podStartSLOduration=1.505867334 podStartE2EDuration="4.17737524s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:29.552278341 +0000 UTC m=+345.190803441" lastFinishedPulling="2026-04-22 18:43:32.223786237 +0000 UTC m=+347.862311347" observedRunningTime="2026-04-22 18:43:33.175644897 +0000 UTC m=+348.814170019" watchObservedRunningTime="2026-04-22 18:43:33.17737524 +0000 UTC m=+348.815900365" Apr 22 18:43:38.029895 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.029864 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-c6sch"] Apr 22 18:43:38.033180 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.033161 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" Apr 22 18:43:38.035286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.035257 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 18:43:38.035993 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.035969 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-v2mxh\"" Apr 22 18:43:38.036093 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.035972 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 18:43:38.040662 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.040388 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-c6sch"] Apr 22 18:43:38.138212 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.138172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjq8d\" (UniqueName: \"kubernetes.io/projected/9dd29736-4806-4528-b0db-0da1160ed3d2-kube-api-access-bjq8d\") pod \"cert-manager-cainjector-68b757865b-c6sch\" (UID: \"9dd29736-4806-4528-b0db-0da1160ed3d2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" Apr 22 18:43:38.138366 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.138239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dd29736-4806-4528-b0db-0da1160ed3d2-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-c6sch\" (UID: \"9dd29736-4806-4528-b0db-0da1160ed3d2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" Apr 22 18:43:38.239104 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.239071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dd29736-4806-4528-b0db-0da1160ed3d2-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-c6sch\" (UID: \"9dd29736-4806-4528-b0db-0da1160ed3d2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" Apr 22 18:43:38.239275 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.239138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjq8d\" (UniqueName: \"kubernetes.io/projected/9dd29736-4806-4528-b0db-0da1160ed3d2-kube-api-access-bjq8d\") pod \"cert-manager-cainjector-68b757865b-c6sch\" (UID: \"9dd29736-4806-4528-b0db-0da1160ed3d2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" Apr 22 18:43:38.248052 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.248025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9dd29736-4806-4528-b0db-0da1160ed3d2-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-c6sch\" (UID: \"9dd29736-4806-4528-b0db-0da1160ed3d2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" Apr 22 18:43:38.248239 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.248217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjq8d\" (UniqueName: \"kubernetes.io/projected/9dd29736-4806-4528-b0db-0da1160ed3d2-kube-api-access-bjq8d\") pod \"cert-manager-cainjector-68b757865b-c6sch\" (UID: \"9dd29736-4806-4528-b0db-0da1160ed3d2\") " pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" Apr 22 18:43:38.317858 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.317789 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"] Apr 22 18:43:38.321199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.321181 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" Apr 22 18:43:38.323540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.323519 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:43:38.323613 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.323525 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-94mmb\"" Apr 22 18:43:38.323613 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.323559 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:43:38.331306 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.331281 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"] Apr 22 18:43:38.352257 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.352231 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" Apr 22 18:43:38.440405 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.440376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" Apr 22 18:43:38.440521 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.440420 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5dh\" (UniqueName: \"kubernetes.io/projected/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-kube-api-access-xl5dh\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" Apr 22 18:43:38.440602 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.440577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" Apr 22 18:43:38.472060 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.472033 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-c6sch"] Apr 22 18:43:38.473411 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:43:38.473385 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dd29736_4806_4528_b0db_0da1160ed3d2.slice/crio-cb0003649468424e7ec4a6145b3ee64541c5f79f8fb63c739d54cf6e964f179d WatchSource:0}: Error finding container cb0003649468424e7ec4a6145b3ee64541c5f79f8fb63c739d54cf6e964f179d: Status 404 returned error can't find the container with id cb0003649468424e7ec4a6145b3ee64541c5f79f8fb63c739d54cf6e964f179d
Apr 22 18:43:38.541521 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.541498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:38.541625 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.541566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:38.541625 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.541594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl5dh\" (UniqueName: \"kubernetes.io/projected/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-kube-api-access-xl5dh\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:38.541953 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.541929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:38.542007 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.541980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:38.551324 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.551303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl5dh\" (UniqueName: \"kubernetes.io/projected/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-kube-api-access-xl5dh\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:38.631344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.631276 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:38.750357 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:38.750336 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"]
Apr 22 18:43:38.752374 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:43:38.752349 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e9423a_a3ac_423f_bdc2_67ebeee56d46.slice/crio-6eb2a9cd995a31196df57707128c3ac0a21a5380138f94cc5243c1b4655e7782 WatchSource:0}: Error finding container 6eb2a9cd995a31196df57707128c3ac0a21a5380138f94cc5243c1b4655e7782: Status 404 returned error can't find the container with id 6eb2a9cd995a31196df57707128c3ac0a21a5380138f94cc5243c1b4655e7782
Apr 22 18:43:39.169876 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:39.169831 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" event={"ID":"9dd29736-4806-4528-b0db-0da1160ed3d2","Type":"ContainerStarted","Data":"cb0003649468424e7ec4a6145b3ee64541c5f79f8fb63c739d54cf6e964f179d"}
Apr 22 18:43:39.171987 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:39.171957 2577 generic.go:358] "Generic (PLEG): container finished" podID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerID="4cc940f2a09902682966f9a74501f7c26e227fe16a5972b29f96ccbc307946ee" exitCode=0
Apr 22 18:43:39.172114 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:39.172006 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" event={"ID":"b7e9423a-a3ac-423f-bdc2-67ebeee56d46","Type":"ContainerDied","Data":"4cc940f2a09902682966f9a74501f7c26e227fe16a5972b29f96ccbc307946ee"}
Apr 22 18:43:39.172114 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:39.172028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" event={"ID":"b7e9423a-a3ac-423f-bdc2-67ebeee56d46","Type":"ContainerStarted","Data":"6eb2a9cd995a31196df57707128c3ac0a21a5380138f94cc5243c1b4655e7782"}
Apr 22 18:43:42.184919 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:42.184882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" event={"ID":"9dd29736-4806-4528-b0db-0da1160ed3d2","Type":"ContainerStarted","Data":"a3325151836a10c056ee108439845ba9225658a161d2224f64b6b023490a9b97"}
Apr 22 18:43:42.186554 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:42.186528 2577 generic.go:358] "Generic (PLEG): container finished" podID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerID="44a890845ca4aefa43ca7bf37f1ac76e50d191067ef1954d27ad390685bad4c9" exitCode=0
Apr 22 18:43:42.186695 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:42.186570 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" event={"ID":"b7e9423a-a3ac-423f-bdc2-67ebeee56d46","Type":"ContainerDied","Data":"44a890845ca4aefa43ca7bf37f1ac76e50d191067ef1954d27ad390685bad4c9"}
Apr 22 18:43:42.202038 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:42.201985 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-c6sch" podStartSLOduration=0.995728831 podStartE2EDuration="4.201968565s" podCreationTimestamp="2026-04-22 18:43:38 +0000 UTC" firstStartedPulling="2026-04-22 18:43:38.475300388 +0000 UTC m=+354.113825488" lastFinishedPulling="2026-04-22 18:43:41.681540106 +0000 UTC m=+357.320065222" observedRunningTime="2026-04-22 18:43:42.199913231 +0000 UTC m=+357.838438354" watchObservedRunningTime="2026-04-22 18:43:42.201968565 +0000 UTC m=+357.840493689"
Apr 22 18:43:43.196279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:43.196245 2577 generic.go:358] "Generic (PLEG): container finished" podID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerID="ec7f4bf122dfaa202e079e0b5d53dbf4f5f6aa314f03e19334b6a614c90c936f" exitCode=0
Apr 22 18:43:43.196633 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:43.196320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" event={"ID":"b7e9423a-a3ac-423f-bdc2-67ebeee56d46","Type":"ContainerDied","Data":"ec7f4bf122dfaa202e079e0b5d53dbf4f5f6aa314f03e19334b6a614c90c936f"}
Apr 22 18:43:44.314263 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.314242 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:44.398529 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.398497 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl5dh\" (UniqueName: \"kubernetes.io/projected/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-kube-api-access-xl5dh\") pod \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") "
Apr 22 18:43:44.398702 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.398686 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-util\") pod \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") "
Apr 22 18:43:44.398799 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.398745 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-bundle\") pod \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\" (UID: \"b7e9423a-a3ac-423f-bdc2-67ebeee56d46\") "
Apr 22 18:43:44.399159 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.399130 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-bundle" (OuterVolumeSpecName: "bundle") pod "b7e9423a-a3ac-423f-bdc2-67ebeee56d46" (UID: "b7e9423a-a3ac-423f-bdc2-67ebeee56d46"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:43:44.400539 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.400511 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-kube-api-access-xl5dh" (OuterVolumeSpecName: "kube-api-access-xl5dh") pod "b7e9423a-a3ac-423f-bdc2-67ebeee56d46" (UID: "b7e9423a-a3ac-423f-bdc2-67ebeee56d46"). InnerVolumeSpecName "kube-api-access-xl5dh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:43:44.403278 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.403243 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-util" (OuterVolumeSpecName: "util") pod "b7e9423a-a3ac-423f-bdc2-67ebeee56d46" (UID: "b7e9423a-a3ac-423f-bdc2-67ebeee56d46"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:43:44.499731 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.499635 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xl5dh\" (UniqueName: \"kubernetes.io/projected/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-kube-api-access-xl5dh\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:43:44.499731 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.499672 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:43:44.499731 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:44.499684 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7e9423a-a3ac-423f-bdc2-67ebeee56d46-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:43:45.205660 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:45.205636 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs"
Apr 22 18:43:45.205660 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:45.205641 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4mdrs" event={"ID":"b7e9423a-a3ac-423f-bdc2-67ebeee56d46","Type":"ContainerDied","Data":"6eb2a9cd995a31196df57707128c3ac0a21a5380138f94cc5243c1b4655e7782"}
Apr 22 18:43:45.205660 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:45.205671 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eb2a9cd995a31196df57707128c3ac0a21a5380138f94cc5243c1b4655e7782"
Apr 22 18:43:54.796866 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.796833 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-wf5r7"]
Apr 22 18:43:54.797304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.797179 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerName="util"
Apr 22 18:43:54.797304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.797193 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerName="util"
Apr 22 18:43:54.797304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.797201 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerName="pull"
Apr 22 18:43:54.797304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.797207 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerName="pull"
Apr 22 18:43:54.797304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.797218 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerName="extract"
Apr 22 18:43:54.797304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.797223 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerName="extract"
Apr 22 18:43:54.797304 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.797281 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7e9423a-a3ac-423f-bdc2-67ebeee56d46" containerName="extract"
Apr 22 18:43:54.882010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.881975 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-wf5r7"]
Apr 22 18:43:54.882725 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.882694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-wf5r7"
Apr 22 18:43:54.885372 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.885348 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-hn7ct\""
Apr 22 18:43:54.982933 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.982898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3970dcd8-182c-4ae5-bc3f-7b8a67873b58-bound-sa-token\") pod \"cert-manager-79c8d999ff-wf5r7\" (UID: \"3970dcd8-182c-4ae5-bc3f-7b8a67873b58\") " pod="cert-manager/cert-manager-79c8d999ff-wf5r7"
Apr 22 18:43:54.982933 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:54.982934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6jnl\" (UniqueName: \"kubernetes.io/projected/3970dcd8-182c-4ae5-bc3f-7b8a67873b58-kube-api-access-n6jnl\") pod \"cert-manager-79c8d999ff-wf5r7\" (UID: \"3970dcd8-182c-4ae5-bc3f-7b8a67873b58\") " pod="cert-manager/cert-manager-79c8d999ff-wf5r7"
Apr 22 18:43:55.084332 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:55.084264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3970dcd8-182c-4ae5-bc3f-7b8a67873b58-bound-sa-token\") pod \"cert-manager-79c8d999ff-wf5r7\" (UID: \"3970dcd8-182c-4ae5-bc3f-7b8a67873b58\") " pod="cert-manager/cert-manager-79c8d999ff-wf5r7"
Apr 22 18:43:55.084332 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:55.084302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6jnl\" (UniqueName: \"kubernetes.io/projected/3970dcd8-182c-4ae5-bc3f-7b8a67873b58-kube-api-access-n6jnl\") pod \"cert-manager-79c8d999ff-wf5r7\" (UID: \"3970dcd8-182c-4ae5-bc3f-7b8a67873b58\") " pod="cert-manager/cert-manager-79c8d999ff-wf5r7"
Apr 22 18:43:55.093277 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:55.093247 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3970dcd8-182c-4ae5-bc3f-7b8a67873b58-bound-sa-token\") pod \"cert-manager-79c8d999ff-wf5r7\" (UID: \"3970dcd8-182c-4ae5-bc3f-7b8a67873b58\") " pod="cert-manager/cert-manager-79c8d999ff-wf5r7"
Apr 22 18:43:55.093277 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:55.093271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6jnl\" (UniqueName: \"kubernetes.io/projected/3970dcd8-182c-4ae5-bc3f-7b8a67873b58-kube-api-access-n6jnl\") pod \"cert-manager-79c8d999ff-wf5r7\" (UID: \"3970dcd8-182c-4ae5-bc3f-7b8a67873b58\") " pod="cert-manager/cert-manager-79c8d999ff-wf5r7"
Apr 22 18:43:55.192007 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:55.191980 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-wf5r7"
Apr 22 18:43:55.322521 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:55.322391 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-wf5r7"]
Apr 22 18:43:55.325224 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:43:55.325196 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3970dcd8_182c_4ae5_bc3f_7b8a67873b58.slice/crio-c528fbb008bfe3dbf5c902d299fd5637fbfe94a169b742e00455fa1a543d2790 WatchSource:0}: Error finding container c528fbb008bfe3dbf5c902d299fd5637fbfe94a169b742e00455fa1a543d2790: Status 404 returned error can't find the container with id c528fbb008bfe3dbf5c902d299fd5637fbfe94a169b742e00455fa1a543d2790
Apr 22 18:43:56.244240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:56.244203 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-wf5r7" event={"ID":"3970dcd8-182c-4ae5-bc3f-7b8a67873b58","Type":"ContainerStarted","Data":"6dafb4212902e43be42d9adae005a7078753b240d1855b101f4a2d6cf1e3f559"}
Apr 22 18:43:56.244608 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:56.244248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-wf5r7" event={"ID":"3970dcd8-182c-4ae5-bc3f-7b8a67873b58","Type":"ContainerStarted","Data":"c528fbb008bfe3dbf5c902d299fd5637fbfe94a169b742e00455fa1a543d2790"}
Apr 22 18:43:56.266341 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:43:56.266298 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-wf5r7" podStartSLOduration=2.266284937 podStartE2EDuration="2.266284937s" podCreationTimestamp="2026-04-22 18:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:43:56.265102418 +0000 UTC m=+371.903627542" watchObservedRunningTime="2026-04-22 18:43:56.266284937 +0000 UTC m=+371.904810058"
Apr 22 18:44:04.460126 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.460093 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"]
Apr 22 18:44:04.463701 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.463684 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.466110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.466081 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 18:44:04.466110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.466083 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 18:44:04.466289 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.466142 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-94mmb\""
Apr 22 18:44:04.473539 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.473518 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"]
Apr 22 18:44:04.563262 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.563223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7r8h\" (UniqueName: \"kubernetes.io/projected/b478484f-9602-4312-af64-e122aacb91a3-kube-api-access-j7r8h\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.563430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.563321 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.563430 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.563350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.664460 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.664428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7r8h\" (UniqueName: \"kubernetes.io/projected/b478484f-9602-4312-af64-e122aacb91a3-kube-api-access-j7r8h\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.664632 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.664495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.664632 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.664517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.664855 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.664838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.664907 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.664879 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.673451 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.673424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7r8h\" (UniqueName: \"kubernetes.io/projected/b478484f-9602-4312-af64-e122aacb91a3-kube-api-access-j7r8h\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.774265 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.774201 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:04.893575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:04.893554 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"]
Apr 22 18:44:04.896296 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:44:04.896269 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb478484f_9602_4312_af64_e122aacb91a3.slice/crio-38f365ea17c6fd6da8edea11968d149985925247aee6ca97c9d3d0fb6d564fad WatchSource:0}: Error finding container 38f365ea17c6fd6da8edea11968d149985925247aee6ca97c9d3d0fb6d564fad: Status 404 returned error can't find the container with id 38f365ea17c6fd6da8edea11968d149985925247aee6ca97c9d3d0fb6d564fad
Apr 22 18:44:05.275434 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:05.275401 2577 generic.go:358] "Generic (PLEG): container finished" podID="b478484f-9602-4312-af64-e122aacb91a3" containerID="706a3f224802a1d032c37a0ea558f9023537c823c891e3b4f0a603db114a9ffb" exitCode=0
Apr 22 18:44:05.275636 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:05.275451 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5" event={"ID":"b478484f-9602-4312-af64-e122aacb91a3","Type":"ContainerDied","Data":"706a3f224802a1d032c37a0ea558f9023537c823c891e3b4f0a603db114a9ffb"}
Apr 22 18:44:05.275636 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:05.275476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5" event={"ID":"b478484f-9602-4312-af64-e122aacb91a3","Type":"ContainerStarted","Data":"38f365ea17c6fd6da8edea11968d149985925247aee6ca97c9d3d0fb6d564fad"}
Apr 22 18:44:07.282435 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:07.282398 2577 generic.go:358] "Generic (PLEG): container finished" podID="b478484f-9602-4312-af64-e122aacb91a3" containerID="4a15b1003e9070eb53f7f59f7195306f3e78d6c0db52f4c4f2352cf49b3aa394" exitCode=0
Apr 22 18:44:07.282826 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:07.282488 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5" event={"ID":"b478484f-9602-4312-af64-e122aacb91a3","Type":"ContainerDied","Data":"4a15b1003e9070eb53f7f59f7195306f3e78d6c0db52f4c4f2352cf49b3aa394"}
Apr 22 18:44:08.288150 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:08.288113 2577 generic.go:358] "Generic (PLEG): container finished" podID="b478484f-9602-4312-af64-e122aacb91a3" containerID="c84186c2262d734dd183af64b9fcf4cd46226e059f2c4f008eb81659d00ac214" exitCode=0
Apr 22 18:44:08.288150 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:08.288150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5" event={"ID":"b478484f-9602-4312-af64-e122aacb91a3","Type":"ContainerDied","Data":"c84186c2262d734dd183af64b9fcf4cd46226e059f2c4f008eb81659d00ac214"}
Apr 22 18:44:09.417876 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.417853 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:09.504415 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.504382 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-bundle\") pod \"b478484f-9602-4312-af64-e122aacb91a3\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") "
Apr 22 18:44:09.504575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.504428 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-util\") pod \"b478484f-9602-4312-af64-e122aacb91a3\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") "
Apr 22 18:44:09.504575 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.504486 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7r8h\" (UniqueName: \"kubernetes.io/projected/b478484f-9602-4312-af64-e122aacb91a3-kube-api-access-j7r8h\") pod \"b478484f-9602-4312-af64-e122aacb91a3\" (UID: \"b478484f-9602-4312-af64-e122aacb91a3\") "
Apr 22 18:44:09.505317 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.505283 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-bundle" (OuterVolumeSpecName: "bundle") pod "b478484f-9602-4312-af64-e122aacb91a3" (UID: "b478484f-9602-4312-af64-e122aacb91a3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:44:09.506617 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.506589 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b478484f-9602-4312-af64-e122aacb91a3-kube-api-access-j7r8h" (OuterVolumeSpecName: "kube-api-access-j7r8h") pod "b478484f-9602-4312-af64-e122aacb91a3" (UID: "b478484f-9602-4312-af64-e122aacb91a3"). InnerVolumeSpecName "kube-api-access-j7r8h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:44:09.509929 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.509907 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-util" (OuterVolumeSpecName: "util") pod "b478484f-9602-4312-af64-e122aacb91a3" (UID: "b478484f-9602-4312-af64-e122aacb91a3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:44:09.605160 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.605090 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:44:09.605160 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.605115 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b478484f-9602-4312-af64-e122aacb91a3-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:44:09.605160 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:09.605124 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j7r8h\" (UniqueName: \"kubernetes.io/projected/b478484f-9602-4312-af64-e122aacb91a3-kube-api-access-j7r8h\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:44:10.296947 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:10.296911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5" event={"ID":"b478484f-9602-4312-af64-e122aacb91a3","Type":"ContainerDied","Data":"38f365ea17c6fd6da8edea11968d149985925247aee6ca97c9d3d0fb6d564fad"}
Apr 22 18:44:10.296947 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:10.296946 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f365ea17c6fd6da8edea11968d149985925247aee6ca97c9d3d0fb6d564fad"
Apr 22 18:44:10.297157 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:10.296964 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835n2mj5"
Apr 22 18:44:18.127539 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.127501 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"]
Apr 22 18:44:18.127911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.127877 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b478484f-9602-4312-af64-e122aacb91a3" containerName="util"
Apr 22 18:44:18.127911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.127889 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b478484f-9602-4312-af64-e122aacb91a3" containerName="util"
Apr 22 18:44:18.127911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.127905 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b478484f-9602-4312-af64-e122aacb91a3" containerName="pull"
Apr 22 18:44:18.127911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.127910 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b478484f-9602-4312-af64-e122aacb91a3" containerName="pull"
Apr 22 18:44:18.128039 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.127920 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b478484f-9602-4312-af64-e122aacb91a3" containerName="extract"
Apr 22 18:44:18.128039 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.127926 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b478484f-9602-4312-af64-e122aacb91a3" containerName="extract"
Apr 22 18:44:18.128039 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.127974 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b478484f-9602-4312-af64-e122aacb91a3" containerName="extract"
Apr 22 18:44:18.132177 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.132160 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.135144 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.135124 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-94mmb\""
Apr 22 18:44:18.135537 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.135522 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 18:44:18.136019 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.136003 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 18:44:18.145120 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.145097 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"]
Apr 22 18:44:18.171425 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.171400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.171519 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.171447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vdlq\" (UniqueName: \"kubernetes.io/projected/43706a9f-39ec-454c-8507-4116901ee9c4-kube-api-access-2vdlq\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.171562 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.171516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.272218 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.272185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.272399 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.272245 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.272399 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.272292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vdlq\" (UniqueName: \"kubernetes.io/projected/43706a9f-39ec-454c-8507-4116901ee9c4-kube-api-access-2vdlq\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.272618 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.272595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.272675 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.272645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"
Apr 22 18:44:18.287397 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.287366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vdlq\" (UniqueName: \"kubernetes.io/projected/43706a9f-39ec-454c-8507-4116901ee9c4-kube-api-access-2vdlq\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") "
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" Apr 22 18:44:18.440825 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.440800 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" Apr 22 18:44:18.583777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:18.583753 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg"] Apr 22 18:44:18.586639 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:44:18.586609 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43706a9f_39ec_454c_8507_4116901ee9c4.slice/crio-09acaf4b498f95b95bf7aba92fe630c96ee73929a8e52edc4a66c8f55cbd3081 WatchSource:0}: Error finding container 09acaf4b498f95b95bf7aba92fe630c96ee73929a8e52edc4a66c8f55cbd3081: Status 404 returned error can't find the container with id 09acaf4b498f95b95bf7aba92fe630c96ee73929a8e52edc4a66c8f55cbd3081 Apr 22 18:44:19.327659 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.327620 2577 generic.go:358] "Generic (PLEG): container finished" podID="43706a9f-39ec-454c-8507-4116901ee9c4" containerID="039a2f0fdd0aff53ed8d4b02bc6744513364c849acb2f1d3fe3a948d8366120c" exitCode=0 Apr 22 18:44:19.328137 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.327658 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" event={"ID":"43706a9f-39ec-454c-8507-4116901ee9c4","Type":"ContainerDied","Data":"039a2f0fdd0aff53ed8d4b02bc6744513364c849acb2f1d3fe3a948d8366120c"} Apr 22 18:44:19.328137 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.327704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" event={"ID":"43706a9f-39ec-454c-8507-4116901ee9c4","Type":"ContainerStarted","Data":"09acaf4b498f95b95bf7aba92fe630c96ee73929a8e52edc4a66c8f55cbd3081"} Apr 22 18:44:19.557489 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.557408 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk"] Apr 22 18:44:19.560703 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.560683 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:19.562935 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.562913 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-zp84m\"" Apr 22 18:44:19.563034 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.562914 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 18:44:19.563034 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.562993 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 18:44:19.572080 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.572060 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk"] Apr 22 18:44:19.585452 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.585423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjcjp\" (UniqueName: \"kubernetes.io/projected/2ea24e6b-f0a4-4f1b-b108-b5e64378db21-kube-api-access-xjcjp\") pod \"servicemesh-operator3-55f49c5f94-tj5lk\" (UID: \"2ea24e6b-f0a4-4f1b-b108-b5e64378db21\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 
18:44:19.585552 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.585485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2ea24e6b-f0a4-4f1b-b108-b5e64378db21-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tj5lk\" (UID: \"2ea24e6b-f0a4-4f1b-b108-b5e64378db21\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:19.686265 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.686236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjcjp\" (UniqueName: \"kubernetes.io/projected/2ea24e6b-f0a4-4f1b-b108-b5e64378db21-kube-api-access-xjcjp\") pod \"servicemesh-operator3-55f49c5f94-tj5lk\" (UID: \"2ea24e6b-f0a4-4f1b-b108-b5e64378db21\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:19.686436 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.686276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2ea24e6b-f0a4-4f1b-b108-b5e64378db21-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tj5lk\" (UID: \"2ea24e6b-f0a4-4f1b-b108-b5e64378db21\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:19.688770 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.688752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/2ea24e6b-f0a4-4f1b-b108-b5e64378db21-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tj5lk\" (UID: \"2ea24e6b-f0a4-4f1b-b108-b5e64378db21\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:19.698530 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.698509 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjcjp\" (UniqueName: 
\"kubernetes.io/projected/2ea24e6b-f0a4-4f1b-b108-b5e64378db21-kube-api-access-xjcjp\") pod \"servicemesh-operator3-55f49c5f94-tj5lk\" (UID: \"2ea24e6b-f0a4-4f1b-b108-b5e64378db21\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:19.871279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:19.871209 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:20.008782 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:20.008751 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk"] Apr 22 18:44:20.010928 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:44:20.010899 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ea24e6b_f0a4_4f1b_b108_b5e64378db21.slice/crio-4adb239a702816fe6fbc82dbde8554ce7cb3f102c320598fa7644371cbfa79fe WatchSource:0}: Error finding container 4adb239a702816fe6fbc82dbde8554ce7cb3f102c320598fa7644371cbfa79fe: Status 404 returned error can't find the container with id 4adb239a702816fe6fbc82dbde8554ce7cb3f102c320598fa7644371cbfa79fe Apr 22 18:44:20.335655 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:20.335618 2577 generic.go:358] "Generic (PLEG): container finished" podID="43706a9f-39ec-454c-8507-4116901ee9c4" containerID="02a818bb216f77cbc8841373caff99afcca2fedd31bf66024787e727b4d03b19" exitCode=0 Apr 22 18:44:20.336071 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:20.335691 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" event={"ID":"43706a9f-39ec-454c-8507-4116901ee9c4","Type":"ContainerDied","Data":"02a818bb216f77cbc8841373caff99afcca2fedd31bf66024787e727b4d03b19"} Apr 22 18:44:20.336808 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:20.336778 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" event={"ID":"2ea24e6b-f0a4-4f1b-b108-b5e64378db21","Type":"ContainerStarted","Data":"4adb239a702816fe6fbc82dbde8554ce7cb3f102c320598fa7644371cbfa79fe"} Apr 22 18:44:21.344160 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:21.344113 2577 generic.go:358] "Generic (PLEG): container finished" podID="43706a9f-39ec-454c-8507-4116901ee9c4" containerID="090936153c9d8f6757f5eb49f15365e15ee70dd21913254af682ddd15904c76a" exitCode=0 Apr 22 18:44:21.344553 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:21.344159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" event={"ID":"43706a9f-39ec-454c-8507-4116901ee9c4","Type":"ContainerDied","Data":"090936153c9d8f6757f5eb49f15365e15ee70dd21913254af682ddd15904c76a"} Apr 22 18:44:22.706382 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.706362 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" Apr 22 18:44:22.814899 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.814879 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-util\") pod \"43706a9f-39ec-454c-8507-4116901ee9c4\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " Apr 22 18:44:22.814994 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.814946 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-bundle\") pod \"43706a9f-39ec-454c-8507-4116901ee9c4\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " Apr 22 18:44:22.814994 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.814972 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vdlq\" (UniqueName: \"kubernetes.io/projected/43706a9f-39ec-454c-8507-4116901ee9c4-kube-api-access-2vdlq\") pod \"43706a9f-39ec-454c-8507-4116901ee9c4\" (UID: \"43706a9f-39ec-454c-8507-4116901ee9c4\") " Apr 22 18:44:22.815835 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.815793 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-bundle" (OuterVolumeSpecName: "bundle") pod "43706a9f-39ec-454c-8507-4116901ee9c4" (UID: "43706a9f-39ec-454c-8507-4116901ee9c4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:44:22.816890 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.816859 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43706a9f-39ec-454c-8507-4116901ee9c4-kube-api-access-2vdlq" (OuterVolumeSpecName: "kube-api-access-2vdlq") pod "43706a9f-39ec-454c-8507-4116901ee9c4" (UID: "43706a9f-39ec-454c-8507-4116901ee9c4"). InnerVolumeSpecName "kube-api-access-2vdlq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:44:22.822362 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.822340 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-util" (OuterVolumeSpecName: "util") pod "43706a9f-39ec-454c-8507-4116901ee9c4" (UID: "43706a9f-39ec-454c-8507-4116901ee9c4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:44:22.916008 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.915977 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:44:22.916008 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.916011 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vdlq\" (UniqueName: \"kubernetes.io/projected/43706a9f-39ec-454c-8507-4116901ee9c4-kube-api-access-2vdlq\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:44:22.916494 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:22.916027 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43706a9f-39ec-454c-8507-4116901ee9c4-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:44:23.353777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:23.353709 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" event={"ID":"43706a9f-39ec-454c-8507-4116901ee9c4","Type":"ContainerDied","Data":"09acaf4b498f95b95bf7aba92fe630c96ee73929a8e52edc4a66c8f55cbd3081"} Apr 22 18:44:23.353777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:23.353772 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09acaf4b498f95b95bf7aba92fe630c96ee73929a8e52edc4a66c8f55cbd3081" Apr 22 18:44:23.354006 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:23.353744 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebqgdlg" Apr 22 18:44:23.355517 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:23.355485 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" event={"ID":"2ea24e6b-f0a4-4f1b-b108-b5e64378db21","Type":"ContainerStarted","Data":"995b2fefcc6ea337210e0e1c4ad5e037a00d4a71e15a58b0ce5e9f2a6bc932c3"} Apr 22 18:44:23.355633 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:23.355569 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:23.380291 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:23.380249 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" podStartSLOduration=1.64339356 podStartE2EDuration="4.38023645s" podCreationTimestamp="2026-04-22 18:44:19 +0000 UTC" firstStartedPulling="2026-04-22 18:44:20.013938855 +0000 UTC m=+395.652463962" lastFinishedPulling="2026-04-22 18:44:22.750781742 +0000 UTC m=+398.389306852" observedRunningTime="2026-04-22 18:44:23.37926 +0000 UTC m=+399.017785134" watchObservedRunningTime="2026-04-22 18:44:23.38023645 +0000 UTC m=+399.018761572" Apr 22 18:44:34.361216 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:34.361177 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tj5lk" Apr 22 18:44:46.457684 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.457650 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr"] Apr 22 18:44:46.458165 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.458148 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43706a9f-39ec-454c-8507-4116901ee9c4" containerName="extract" Apr 22 18:44:46.458208 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.458168 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="43706a9f-39ec-454c-8507-4116901ee9c4" containerName="extract" Apr 22 18:44:46.458208 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.458192 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43706a9f-39ec-454c-8507-4116901ee9c4" containerName="util" Apr 22 18:44:46.458208 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.458201 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="43706a9f-39ec-454c-8507-4116901ee9c4" containerName="util" Apr 22 18:44:46.458302 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.458218 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43706a9f-39ec-454c-8507-4116901ee9c4" containerName="pull" Apr 22 18:44:46.458302 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.458227 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="43706a9f-39ec-454c-8507-4116901ee9c4" containerName="pull" Apr 22 18:44:46.458362 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.458306 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="43706a9f-39ec-454c-8507-4116901ee9c4" containerName="extract" Apr 22 18:44:46.463324 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.463305 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.465395 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.465375 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 18:44:46.465505 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.465395 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 18:44:46.465633 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.465617 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 18:44:46.465743 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.465726 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 18:44:46.465813 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.465785 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-qrvws\"" Apr 22 18:44:46.471991 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.471968 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr"] Apr 22 18:44:46.609985 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.609951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.610138 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.610004 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.610138 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.610030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.610138 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.610090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.610138 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.610107 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dq4q\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-kube-api-access-9dq4q\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.610138 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.610124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-local-certs\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.610301 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.610148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.711665 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.711574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.711665 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.711625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dq4q\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-kube-api-access-9dq4q\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.711665 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.711655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.711957 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:44:46.711702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.711957 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.711765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.711957 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.711935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.712114 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.711987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.712428 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.712403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.718075 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.717763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.718075 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.718039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.718462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.718432 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.718576 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.718489 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.719995 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.719975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dq4q\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-kube-api-access-9dq4q\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.720322 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.720307 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-2k4tr\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.773621 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.773596 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:46.911236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:46.911204 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr"] Apr 22 18:44:46.912705 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:44:46.912672 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0b8e9e1_cfdb_4277_83c6_442ca4e761dc.slice/crio-02863ee48e1c1661ea77023b32d9a56f3e01c8be36340b1d65c3849e5ef093f4 WatchSource:0}: Error finding container 02863ee48e1c1661ea77023b32d9a56f3e01c8be36340b1d65c3849e5ef093f4: Status 404 returned error can't find the container with id 02863ee48e1c1661ea77023b32d9a56f3e01c8be36340b1d65c3849e5ef093f4 Apr 22 18:44:47.440406 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:47.440365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" event={"ID":"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc","Type":"ContainerStarted","Data":"02863ee48e1c1661ea77023b32d9a56f3e01c8be36340b1d65c3849e5ef093f4"} Apr 22 18:44:49.854600 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:49.854563 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 18:44:49.854881 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:49.854645 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 18:44:50.452846 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:50.452811 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" event={"ID":"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc","Type":"ContainerStarted","Data":"74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd"} Apr 22 18:44:50.453050 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:50.453032 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:50.454598 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:50.454573 2577 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-2k4tr container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 22 18:44:50.454736 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:50.454625 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" podUID="e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:44:50.475914 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:50.475868 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" podStartSLOduration=1.5361299590000002 podStartE2EDuration="4.475854159s" podCreationTimestamp="2026-04-22 18:44:46 +0000 UTC" firstStartedPulling="2026-04-22 18:44:46.914633255 +0000 UTC m=+422.553158369" lastFinishedPulling="2026-04-22 18:44:49.854357453 +0000 UTC m=+425.492882569" observedRunningTime="2026-04-22 18:44:50.474481158 +0000 UTC m=+426.113006280" watchObservedRunningTime="2026-04-22 18:44:50.475854159 +0000 UTC m=+426.114379283" Apr 22 18:44:51.456674 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:51.456644 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" Apr 22 18:44:52.713732 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.713686 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s"] Apr 22 18:44:52.722545 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.722518 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.724949 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.724927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-q47gf\"" Apr 22 18:44:52.735320 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.735292 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s"] Apr 22 18:44:52.869255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.869424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.869424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.869424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.869424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.869561 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.869561 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.869561 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.869561 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.869539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7z5\" (UniqueName: \"kubernetes.io/projected/00a61ee1-c541-4c2e-9b18-cca0dafd449d-kube-api-access-nr7z5\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971005 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.970927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971005 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.970974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7z5\" (UniqueName: \"kubernetes.io/projected/00a61ee1-c541-4c2e-9b18-cca0dafd449d-kube-api-access-nr7z5\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971234 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971234 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971234 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971234 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971234 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:44:52.971185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971234 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971527 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971527 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971527 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971527 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971756 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971566 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.971820 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.971793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.973365 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.973343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-envoy\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.973639 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.973622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.979480 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.979459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/00a61ee1-c541-4c2e-9b18-cca0dafd449d-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:52.979582 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:52.979465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7z5\" (UniqueName: \"kubernetes.io/projected/00a61ee1-c541-4c2e-9b18-cca0dafd449d-kube-api-access-nr7z5\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-z9w6s\" (UID: \"00a61ee1-c541-4c2e-9b18-cca0dafd449d\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:53.036540 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:53.036515 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:53.372147 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:53.372118 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s"] Apr 22 18:44:53.373969 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:44:53.373938 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a61ee1_c541_4c2e_9b18_cca0dafd449d.slice/crio-affdc93a2cb21dd936c973f231d41dfabe556276840d7af8b2882432e5448d17 WatchSource:0}: Error finding container affdc93a2cb21dd936c973f231d41dfabe556276840d7af8b2882432e5448d17: Status 404 returned error can't find the container with id affdc93a2cb21dd936c973f231d41dfabe556276840d7af8b2882432e5448d17 Apr 22 18:44:53.466153 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:53.466117 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" event={"ID":"00a61ee1-c541-4c2e-9b18-cca0dafd449d","Type":"ContainerStarted","Data":"affdc93a2cb21dd936c973f231d41dfabe556276840d7af8b2882432e5448d17"} Apr 22 18:44:54.075460 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.075424 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75cfdbbd69-4rwhn"] Apr 22 18:44:54.078348 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.078326 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.081460 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.081437 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:44:54.081597 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.081546 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:44:54.082483 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.082462 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:44:54.085551 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.085428 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xkjq2\"" Apr 22 18:44:54.085551 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.085443 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:44:54.086460 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.086437 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:44:54.090027 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.090006 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75cfdbbd69-4rwhn"] Apr 22 18:44:54.095632 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.095613 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:44:54.182447 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.182414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/16e4bc48-b665-438a-944b-bed7491377b7-console-serving-cert\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.182615 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.182466 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-console-config\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.182615 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.182534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-trusted-ca-bundle\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.182615 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.182559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-oauth-serving-cert\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.182784 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.182617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16e4bc48-b665-438a-944b-bed7491377b7-console-oauth-config\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.182784 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:44:54.182631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph9h5\" (UniqueName: \"kubernetes.io/projected/16e4bc48-b665-438a-944b-bed7491377b7-kube-api-access-ph9h5\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.182784 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.182649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-service-ca\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.283350 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.283313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-service-ca\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.283537 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.283385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16e4bc48-b665-438a-944b-bed7491377b7-console-serving-cert\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.283537 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.283441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-console-config\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " 
pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.283537 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.283488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-trusted-ca-bundle\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.283537 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.283515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-oauth-serving-cert\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.283764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.283602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16e4bc48-b665-438a-944b-bed7491377b7-console-oauth-config\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.283764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.283627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9h5\" (UniqueName: \"kubernetes.io/projected/16e4bc48-b665-438a-944b-bed7491377b7-kube-api-access-ph9h5\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.284198 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.284171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-console-config\") pod 
\"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.284306 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.284171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-service-ca\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.284346 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.284304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-trusted-ca-bundle\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.284579 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.284559 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16e4bc48-b665-438a-944b-bed7491377b7-oauth-serving-cert\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.286477 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.286453 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16e4bc48-b665-438a-944b-bed7491377b7-console-serving-cert\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.286576 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.286558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/16e4bc48-b665-438a-944b-bed7491377b7-console-oauth-config\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.297934 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.297907 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9h5\" (UniqueName: \"kubernetes.io/projected/16e4bc48-b665-438a-944b-bed7491377b7-kube-api-access-ph9h5\") pod \"console-75cfdbbd69-4rwhn\" (UID: \"16e4bc48-b665-438a-944b-bed7491377b7\") " pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.390003 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.389929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:44:54.542461 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:54.542418 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75cfdbbd69-4rwhn"] Apr 22 18:44:54.546861 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:44:54.546816 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16e4bc48_b665_438a_944b_bed7491377b7.slice/crio-4de0545ba0d41a7714cb068bd238db445909ed352b66988b354a45374220174f WatchSource:0}: Error finding container 4de0545ba0d41a7714cb068bd238db445909ed352b66988b354a45374220174f: Status 404 returned error can't find the container with id 4de0545ba0d41a7714cb068bd238db445909ed352b66988b354a45374220174f Apr 22 18:44:55.484240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:55.484189 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cfdbbd69-4rwhn" event={"ID":"16e4bc48-b665-438a-944b-bed7491377b7","Type":"ContainerStarted","Data":"a6840d3900b79345b3d77739c64c2e1328433f8bce69d96fdacd6b93cf97a196"} Apr 22 18:44:55.484240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:55.484243 
2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cfdbbd69-4rwhn" event={"ID":"16e4bc48-b665-438a-944b-bed7491377b7","Type":"ContainerStarted","Data":"4de0545ba0d41a7714cb068bd238db445909ed352b66988b354a45374220174f"} Apr 22 18:44:55.506413 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:55.506197 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75cfdbbd69-4rwhn" podStartSLOduration=1.5061794549999998 podStartE2EDuration="1.506179455s" podCreationTimestamp="2026-04-22 18:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:44:55.505884159 +0000 UTC m=+431.144409284" watchObservedRunningTime="2026-04-22 18:44:55.506179455 +0000 UTC m=+431.144704574" Apr 22 18:44:56.343192 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:56.343155 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 18:44:56.343300 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:56.343240 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 18:44:56.343300 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:56.343269 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 18:44:56.489673 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:56.489641 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" 
event={"ID":"00a61ee1-c541-4c2e-9b18-cca0dafd449d","Type":"ContainerStarted","Data":"2d181cb67e02d5ddc7873c5b38f59cd3ade8e4e66516b422ec2686bd765ec3e9"} Apr 22 18:44:56.513873 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:56.513823 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" podStartSLOduration=1.547070405 podStartE2EDuration="4.513808596s" podCreationTimestamp="2026-04-22 18:44:52 +0000 UTC" firstStartedPulling="2026-04-22 18:44:53.376170468 +0000 UTC m=+429.014695583" lastFinishedPulling="2026-04-22 18:44:56.342908668 +0000 UTC m=+431.981433774" observedRunningTime="2026-04-22 18:44:56.511194282 +0000 UTC m=+432.149719405" watchObservedRunningTime="2026-04-22 18:44:56.513808596 +0000 UTC m=+432.152333717" Apr 22 18:44:57.037315 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:57.037229 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:57.038725 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:57.038694 2577 patch_prober.go:28] interesting pod/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.34:15021/healthz/ready\": dial tcp 10.133.0.34:15021: connect: connection refused" start-of-body= Apr 22 18:44:57.038784 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:57.038754 2577 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" podUID="00a61ee1-c541-4c2e-9b18-cca0dafd449d" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.34:15021/healthz/ready\": dial tcp 10.133.0.34:15021: connect: connection refused" Apr 22 18:44:58.041284 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:58.041253 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:58.497401 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:58.497367 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:44:58.498152 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:44:58.498133 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-z9w6s" Apr 22 18:45:00.766497 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.766460 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj"] Apr 22 18:45:00.769995 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.769969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.772536 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.772508 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:45:00.772672 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.772656 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-94mmb\"" Apr 22 18:45:00.773433 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.773416 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:45:00.778863 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.778839 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj"] Apr 22 18:45:00.844241 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:45:00.844201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.844241 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.844245 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjm8z\" (UniqueName: \"kubernetes.io/projected/531b04b5-62b1-477e-a856-1997183fb66b-kube-api-access-gjm8z\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.844447 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.844333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.864211 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.864173 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll"] Apr 22 18:45:00.867665 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.867650 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:00.875567 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.875539 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll"] Apr 22 18:45:00.944995 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.944962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:00.945168 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.945012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rcr\" (UniqueName: \"kubernetes.io/projected/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-kube-api-access-l4rcr\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:00.945168 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.945047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.945168 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.945082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:00.945168 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.945116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.945168 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.945141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjm8z\" (UniqueName: \"kubernetes.io/projected/531b04b5-62b1-477e-a856-1997183fb66b-kube-api-access-gjm8z\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.945424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.945359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.945502 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.945479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.954254 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.954230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjm8z\" (UniqueName: \"kubernetes.io/projected/531b04b5-62b1-477e-a856-1997183fb66b-kube-api-access-gjm8z\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:00.969456 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.969428 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r"] Apr 22 18:45:00.973113 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.973093 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:00.981691 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:00.981669 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r"] Apr 22 18:45:01.046128 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.046039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vm48\" (UniqueName: \"kubernetes.io/projected/f26530c0-159a-4f80-b8f8-2e7f87601759-kube-api-access-7vm48\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.046128 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.046092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.046307 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.046177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:01.046307 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.046222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l4rcr\" (UniqueName: \"kubernetes.io/projected/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-kube-api-access-l4rcr\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:01.046307 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.046245 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.046307 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.046296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:01.046591 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.046573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:01.046591 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.046584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:01.059695 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.059667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rcr\" (UniqueName: \"kubernetes.io/projected/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-kube-api-access-l4rcr\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:01.067909 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.067885 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4"] Apr 22 18:45:01.071418 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.071403 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.080170 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.080146 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:01.081675 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.081653 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4"] Apr 22 18:45:01.147201 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.147174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.147358 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.147238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.147358 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.147272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.147358 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.147300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gn8nb\" (UniqueName: \"kubernetes.io/projected/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-kube-api-access-gn8nb\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.147521 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.147371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vm48\" (UniqueName: \"kubernetes.io/projected/f26530c0-159a-4f80-b8f8-2e7f87601759-kube-api-access-7vm48\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.147521 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.147399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.147681 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.147618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.147681 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.147673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.156816 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.156790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vm48\" (UniqueName: \"kubernetes.io/projected/f26530c0-159a-4f80-b8f8-2e7f87601759-kube-api-access-7vm48\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.177841 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.177815 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:01.248339 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.248306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn8nb\" (UniqueName: \"kubernetes.io/projected/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-kube-api-access-gn8nb\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.248495 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.248418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.248495 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.248445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.248845 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.248828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.248918 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.248847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.257427 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.257402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn8nb\" (UniqueName: \"kubernetes.io/projected/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-kube-api-access-gn8nb\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 
18:45:01.284249 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.284224 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:01.382759 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.382729 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:01.407589 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.407496 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r"] Apr 22 18:45:01.410793 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:45:01.410711 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf26530c0_159a_4f80_b8f8_2e7f87601759.slice/crio-65e540fb40bc030e2ade4b51f38fcc71c291ecfb3d0246c3237ce77af83ab9b9 WatchSource:0}: Error finding container 65e540fb40bc030e2ade4b51f38fcc71c291ecfb3d0246c3237ce77af83ab9b9: Status 404 returned error can't find the container with id 65e540fb40bc030e2ade4b51f38fcc71c291ecfb3d0246c3237ce77af83ab9b9 Apr 22 18:45:01.413518 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.413497 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj"] Apr 22 18:45:01.415409 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:45:01.415388 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod531b04b5_62b1_477e_a856_1997183fb66b.slice/crio-0db27be18d81911592f2da15e5a6d359f946d936226be71eef18591b4a232062 WatchSource:0}: Error finding container 0db27be18d81911592f2da15e5a6d359f946d936226be71eef18591b4a232062: Status 404 returned error can't find the container with id 
0db27be18d81911592f2da15e5a6d359f946d936226be71eef18591b4a232062 Apr 22 18:45:01.509330 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.509294 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll"] Apr 22 18:45:01.511818 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:45:01.511787 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa9079f_3ff2_48a6_b5ea_ce05a85873c6.slice/crio-68733a1522bfa66be27b1c34474d1d183217ea32fae38ae6f6f1459d9b0fa3dd WatchSource:0}: Error finding container 68733a1522bfa66be27b1c34474d1d183217ea32fae38ae6f6f1459d9b0fa3dd: Status 404 returned error can't find the container with id 68733a1522bfa66be27b1c34474d1d183217ea32fae38ae6f6f1459d9b0fa3dd Apr 22 18:45:01.513993 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.513760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" event={"ID":"f26530c0-159a-4f80-b8f8-2e7f87601759","Type":"ContainerStarted","Data":"6dd2901a7b3b5a16db38af948c33b7c561db2f2a1dc7edbbc25c6b82a7c33517"} Apr 22 18:45:01.513993 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.513799 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" event={"ID":"f26530c0-159a-4f80-b8f8-2e7f87601759","Type":"ContainerStarted","Data":"65e540fb40bc030e2ade4b51f38fcc71c291ecfb3d0246c3237ce77af83ab9b9"} Apr 22 18:45:01.515545 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.515504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" event={"ID":"531b04b5-62b1-477e-a856-1997183fb66b","Type":"ContainerStarted","Data":"c6af39e7215a1fab988333681a02717210a44e02f68296b8dbe02b9973465d5c"} Apr 22 
18:45:01.515623 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.515557 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" event={"ID":"531b04b5-62b1-477e-a856-1997183fb66b","Type":"ContainerStarted","Data":"0db27be18d81911592f2da15e5a6d359f946d936226be71eef18591b4a232062"} Apr 22 18:45:01.526203 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:01.526168 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4"] Apr 22 18:45:01.614000 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:45:01.613968 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbb78d1d_21dc_4b1a_8eeb_e71b45d92861.slice/crio-cbe6015eff837b77f8359764c3905095bfa291c561e415c46e9b1a8b87774cb4 WatchSource:0}: Error finding container cbe6015eff837b77f8359764c3905095bfa291c561e415c46e9b1a8b87774cb4: Status 404 returned error can't find the container with id cbe6015eff837b77f8359764c3905095bfa291c561e415c46e9b1a8b87774cb4 Apr 22 18:45:02.520408 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.520371 2577 generic.go:358] "Generic (PLEG): container finished" podID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerID="d38688ecd2a2360e3cfb8cd29a6dfa1b4306169dec969c5f91b167882c2c67ca" exitCode=0 Apr 22 18:45:02.520847 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.520449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" event={"ID":"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6","Type":"ContainerDied","Data":"d38688ecd2a2360e3cfb8cd29a6dfa1b4306169dec969c5f91b167882c2c67ca"} Apr 22 18:45:02.520847 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.520485 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" event={"ID":"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6","Type":"ContainerStarted","Data":"68733a1522bfa66be27b1c34474d1d183217ea32fae38ae6f6f1459d9b0fa3dd"} Apr 22 18:45:02.521830 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.521805 2577 generic.go:358] "Generic (PLEG): container finished" podID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerID="6dd2901a7b3b5a16db38af948c33b7c561db2f2a1dc7edbbc25c6b82a7c33517" exitCode=0 Apr 22 18:45:02.521932 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.521893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" event={"ID":"f26530c0-159a-4f80-b8f8-2e7f87601759","Type":"ContainerDied","Data":"6dd2901a7b3b5a16db38af948c33b7c561db2f2a1dc7edbbc25c6b82a7c33517"} Apr 22 18:45:02.523348 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.523331 2577 generic.go:358] "Generic (PLEG): container finished" podID="531b04b5-62b1-477e-a856-1997183fb66b" containerID="c6af39e7215a1fab988333681a02717210a44e02f68296b8dbe02b9973465d5c" exitCode=0 Apr 22 18:45:02.523424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.523403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" event={"ID":"531b04b5-62b1-477e-a856-1997183fb66b","Type":"ContainerDied","Data":"c6af39e7215a1fab988333681a02717210a44e02f68296b8dbe02b9973465d5c"} Apr 22 18:45:02.524843 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.524795 2577 generic.go:358] "Generic (PLEG): container finished" podID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerID="55f7dd58b1308b5e6adceaddf46bc50b631d5fc5112fa18b5451a7ae3a86d6b5" exitCode=0 Apr 22 18:45:02.524950 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.524845 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" event={"ID":"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861","Type":"ContainerDied","Data":"55f7dd58b1308b5e6adceaddf46bc50b631d5fc5112fa18b5451a7ae3a86d6b5"} Apr 22 18:45:02.524950 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:02.524869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" event={"ID":"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861","Type":"ContainerStarted","Data":"cbe6015eff837b77f8359764c3905095bfa291c561e415c46e9b1a8b87774cb4"} Apr 22 18:45:04.390556 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:04.390525 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:45:04.390556 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:04.390563 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:45:04.395107 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:04.395087 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:45:04.536511 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:04.536484 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75cfdbbd69-4rwhn" Apr 22 18:45:06.541227 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:06.541190 2577 generic.go:358] "Generic (PLEG): container finished" podID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerID="3c3841e7fbe7a15af173a9553d09ca0d432d1dbd5a7e5203ef0f536b6a671962" exitCode=0 Apr 22 18:45:06.541632 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:06.541280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" 
event={"ID":"f26530c0-159a-4f80-b8f8-2e7f87601759","Type":"ContainerDied","Data":"3c3841e7fbe7a15af173a9553d09ca0d432d1dbd5a7e5203ef0f536b6a671962"} Apr 22 18:45:06.542938 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:06.542912 2577 generic.go:358] "Generic (PLEG): container finished" podID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerID="2143f1d51be7da8ef85a516ac97d254fc8017a8347641bbb1b8189d36dc794a8" exitCode=0 Apr 22 18:45:06.543020 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:06.542987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" event={"ID":"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861","Type":"ContainerDied","Data":"2143f1d51be7da8ef85a516ac97d254fc8017a8347641bbb1b8189d36dc794a8"} Apr 22 18:45:07.549649 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:07.549613 2577 generic.go:358] "Generic (PLEG): container finished" podID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerID="ef2c63cc1eb13daf6c0f919f3552c62cd52aa6bbe7387ccd5e107ccb3f63c714" exitCode=0 Apr 22 18:45:07.550100 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:07.549694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" event={"ID":"f26530c0-159a-4f80-b8f8-2e7f87601759","Type":"ContainerDied","Data":"ef2c63cc1eb13daf6c0f919f3552c62cd52aa6bbe7387ccd5e107ccb3f63c714"} Apr 22 18:45:07.551462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:07.551438 2577 generic.go:358] "Generic (PLEG): container finished" podID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerID="0fb69497de885a2ae7fa9ed7330aa977d9eaffc4074cf1db6e50fecf207f874c" exitCode=0 Apr 22 18:45:07.551608 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:07.551472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" 
event={"ID":"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861","Type":"ContainerDied","Data":"0fb69497de885a2ae7fa9ed7330aa977d9eaffc4074cf1db6e50fecf207f874c"} Apr 22 18:45:08.707765 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.707745 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:08.710929 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.710913 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:08.815029 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.815006 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-util\") pod \"f26530c0-159a-4f80-b8f8-2e7f87601759\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " Apr 22 18:45:08.815152 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.815050 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-bundle\") pod \"f26530c0-159a-4f80-b8f8-2e7f87601759\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " Apr 22 18:45:08.815152 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.815089 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-util\") pod \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " Apr 22 18:45:08.815152 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.815122 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vm48\" (UniqueName: 
\"kubernetes.io/projected/f26530c0-159a-4f80-b8f8-2e7f87601759-kube-api-access-7vm48\") pod \"f26530c0-159a-4f80-b8f8-2e7f87601759\" (UID: \"f26530c0-159a-4f80-b8f8-2e7f87601759\") " Apr 22 18:45:08.815312 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.815152 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-bundle\") pod \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " Apr 22 18:45:08.815312 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.815176 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn8nb\" (UniqueName: \"kubernetes.io/projected/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-kube-api-access-gn8nb\") pod \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\" (UID: \"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861\") " Apr 22 18:45:08.815675 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.815646 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-bundle" (OuterVolumeSpecName: "bundle") pod "f26530c0-159a-4f80-b8f8-2e7f87601759" (UID: "f26530c0-159a-4f80-b8f8-2e7f87601759"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:08.815820 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.815789 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-bundle" (OuterVolumeSpecName: "bundle") pod "dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" (UID: "dbb78d1d-21dc-4b1a-8eeb-e71b45d92861"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:08.817410 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.817335 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26530c0-159a-4f80-b8f8-2e7f87601759-kube-api-access-7vm48" (OuterVolumeSpecName: "kube-api-access-7vm48") pod "f26530c0-159a-4f80-b8f8-2e7f87601759" (UID: "f26530c0-159a-4f80-b8f8-2e7f87601759"). InnerVolumeSpecName "kube-api-access-7vm48". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:08.817782 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.817742 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-kube-api-access-gn8nb" (OuterVolumeSpecName: "kube-api-access-gn8nb") pod "dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" (UID: "dbb78d1d-21dc-4b1a-8eeb-e71b45d92861"). InnerVolumeSpecName "kube-api-access-gn8nb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:08.820134 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.820114 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-util" (OuterVolumeSpecName: "util") pod "dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" (UID: "dbb78d1d-21dc-4b1a-8eeb-e71b45d92861"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:08.821240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.821216 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-util" (OuterVolumeSpecName: "util") pod "f26530c0-159a-4f80-b8f8-2e7f87601759" (UID: "f26530c0-159a-4f80-b8f8-2e7f87601759"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:08.916464 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.916435 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:08.916555 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.916467 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f26530c0-159a-4f80-b8f8-2e7f87601759-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:08.916555 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.916480 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:08.916555 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.916494 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vm48\" (UniqueName: \"kubernetes.io/projected/f26530c0-159a-4f80-b8f8-2e7f87601759-kube-api-access-7vm48\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:08.916555 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.916509 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:08.916555 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:08.916523 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gn8nb\" (UniqueName: \"kubernetes.io/projected/dbb78d1d-21dc-4b1a-8eeb-e71b45d92861-kube-api-access-gn8nb\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:09.562216 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.562188 2577 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" Apr 22 18:45:09.562384 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.562206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sf7z4" event={"ID":"dbb78d1d-21dc-4b1a-8eeb-e71b45d92861","Type":"ContainerDied","Data":"cbe6015eff837b77f8359764c3905095bfa291c561e415c46e9b1a8b87774cb4"} Apr 22 18:45:09.562384 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.562250 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbe6015eff837b77f8359764c3905095bfa291c561e415c46e9b1a8b87774cb4" Apr 22 18:45:09.563925 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.563901 2577 generic.go:358] "Generic (PLEG): container finished" podID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerID="ae01d57d096af4327e331c84d0ce36e33218f903e3b42542ddade417fe731f75" exitCode=0 Apr 22 18:45:09.564053 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.563973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" event={"ID":"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6","Type":"ContainerDied","Data":"ae01d57d096af4327e331c84d0ce36e33218f903e3b42542ddade417fe731f75"} Apr 22 18:45:09.566023 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.566005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" event={"ID":"f26530c0-159a-4f80-b8f8-2e7f87601759","Type":"ContainerDied","Data":"65e540fb40bc030e2ade4b51f38fcc71c291ecfb3d0246c3237ce77af83ab9b9"} Apr 22 18:45:09.566106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.566023 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88jh25r" Apr 22 18:45:09.566106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.566029 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65e540fb40bc030e2ade4b51f38fcc71c291ecfb3d0246c3237ce77af83ab9b9" Apr 22 18:45:09.567864 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.567841 2577 generic.go:358] "Generic (PLEG): container finished" podID="531b04b5-62b1-477e-a856-1997183fb66b" containerID="dc10e844f427cd4f3d7900de1f5dae3490ecf4566af7f2bfa88aa54f52e00fe5" exitCode=0 Apr 22 18:45:09.568067 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:09.567871 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" event={"ID":"531b04b5-62b1-477e-a856-1997183fb66b","Type":"ContainerDied","Data":"dc10e844f427cd4f3d7900de1f5dae3490ecf4566af7f2bfa88aa54f52e00fe5"} Apr 22 18:45:10.573224 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:10.573193 2577 generic.go:358] "Generic (PLEG): container finished" podID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerID="3c8a2efb7d0040301114896af414b198d47939adc050795714eabcd1cfd77fd0" exitCode=0 Apr 22 18:45:10.573613 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:10.573262 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" event={"ID":"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6","Type":"ContainerDied","Data":"3c8a2efb7d0040301114896af414b198d47939adc050795714eabcd1cfd77fd0"} Apr 22 18:45:10.574926 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:10.574905 2577 generic.go:358] "Generic (PLEG): container finished" podID="531b04b5-62b1-477e-a856-1997183fb66b" containerID="d562be759f87b37eaf33ae39f9eb2619f8a7a5f07302a0f7666b5d2bddbfd309" exitCode=0 Apr 22 18:45:10.575032 ip-10-0-139-10 kubenswrapper[2577]: I0422 
18:45:10.574957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" event={"ID":"531b04b5-62b1-477e-a856-1997183fb66b","Type":"ContainerDied","Data":"d562be759f87b37eaf33ae39f9eb2619f8a7a5f07302a0f7666b5d2bddbfd309"} Apr 22 18:45:11.708840 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.708819 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:11.735121 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.735102 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:11.738887 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.738868 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-bundle\") pod \"531b04b5-62b1-477e-a856-1997183fb66b\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " Apr 22 18:45:11.738973 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.738930 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjm8z\" (UniqueName: \"kubernetes.io/projected/531b04b5-62b1-477e-a856-1997183fb66b-kube-api-access-gjm8z\") pod \"531b04b5-62b1-477e-a856-1997183fb66b\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " Apr 22 18:45:11.739038 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.738985 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-util\") pod \"531b04b5-62b1-477e-a856-1997183fb66b\" (UID: \"531b04b5-62b1-477e-a856-1997183fb66b\") " Apr 22 18:45:11.739358 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.739325 
2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-bundle" (OuterVolumeSpecName: "bundle") pod "531b04b5-62b1-477e-a856-1997183fb66b" (UID: "531b04b5-62b1-477e-a856-1997183fb66b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:11.741162 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.741138 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/531b04b5-62b1-477e-a856-1997183fb66b-kube-api-access-gjm8z" (OuterVolumeSpecName: "kube-api-access-gjm8z") pod "531b04b5-62b1-477e-a856-1997183fb66b" (UID: "531b04b5-62b1-477e-a856-1997183fb66b"). InnerVolumeSpecName "kube-api-access-gjm8z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:11.746247 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.746220 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-util" (OuterVolumeSpecName: "util") pod "531b04b5-62b1-477e-a856-1997183fb66b" (UID: "531b04b5-62b1-477e-a856-1997183fb66b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:11.839489 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.839418 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-bundle\") pod \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " Apr 22 18:45:11.839489 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.839481 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-util\") pod \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " Apr 22 18:45:11.839683 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.839518 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4rcr\" (UniqueName: \"kubernetes.io/projected/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-kube-api-access-l4rcr\") pod \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\" (UID: \"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6\") " Apr 22 18:45:11.839786 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.839771 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:11.839830 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.839795 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/531b04b5-62b1-477e-a856-1997183fb66b-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:11.839830 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.839810 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjm8z\" (UniqueName: 
\"kubernetes.io/projected/531b04b5-62b1-477e-a856-1997183fb66b-kube-api-access-gjm8z\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:11.840115 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.840089 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-bundle" (OuterVolumeSpecName: "bundle") pod "7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" (UID: "7aa9079f-3ff2-48a6-b5ea-ce05a85873c6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:11.841524 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.841503 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-kube-api-access-l4rcr" (OuterVolumeSpecName: "kube-api-access-l4rcr") pod "7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" (UID: "7aa9079f-3ff2-48a6-b5ea-ce05a85873c6"). InnerVolumeSpecName "kube-api-access-l4rcr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:11.843631 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.843600 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-util" (OuterVolumeSpecName: "util") pod "7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" (UID: "7aa9079f-3ff2-48a6-b5ea-ce05a85873c6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:11.941110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.941072 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-util\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:11.941110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.941109 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l4rcr\" (UniqueName: \"kubernetes.io/projected/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-kube-api-access-l4rcr\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:11.941110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:11.941120 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7aa9079f-3ff2-48a6-b5ea-ce05a85873c6-bundle\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:45:12.590404 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:12.590368 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" event={"ID":"7aa9079f-3ff2-48a6-b5ea-ce05a85873c6","Type":"ContainerDied","Data":"68733a1522bfa66be27b1c34474d1d183217ea32fae38ae6f6f1459d9b0fa3dd"} Apr 22 18:45:12.590404 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:12.590389 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503dvpll" Apr 22 18:45:12.590404 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:12.590401 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68733a1522bfa66be27b1c34474d1d183217ea32fae38ae6f6f1459d9b0fa3dd" Apr 22 18:45:12.592027 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:12.592004 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" event={"ID":"531b04b5-62b1-477e-a856-1997183fb66b","Type":"ContainerDied","Data":"0db27be18d81911592f2da15e5a6d359f946d936226be71eef18591b4a232062"} Apr 22 18:45:12.592027 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:12.592026 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0db27be18d81911592f2da15e5a6d359f946d936226be71eef18591b4a232062" Apr 22 18:45:12.592335 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:12.592071 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bzxwpj" Apr 22 18:45:18.115550 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115505 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg"] Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115927 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerName="pull" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115940 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerName="pull" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115950 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerName="pull" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115956 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerName="pull" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115964 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerName="extract" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115969 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerName="extract" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115979 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="531b04b5-62b1-477e-a856-1997183fb66b" containerName="extract" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115984 2577 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="531b04b5-62b1-477e-a856-1997183fb66b" containerName="extract" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115991 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="531b04b5-62b1-477e-a856-1997183fb66b" containerName="util" Apr 22 18:45:18.115990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.115997 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="531b04b5-62b1-477e-a856-1997183fb66b" containerName="util" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116004 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerName="extract" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116009 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerName="extract" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116016 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="531b04b5-62b1-477e-a856-1997183fb66b" containerName="pull" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116021 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="531b04b5-62b1-477e-a856-1997183fb66b" containerName="pull" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116031 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerName="pull" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116036 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerName="pull" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116043 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" 
containerName="util" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116048 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerName="util" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116053 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerName="extract" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116058 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerName="extract" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116069 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerName="util" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116075 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerName="util" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116083 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerName="util" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116088 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerName="util" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116151 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="531b04b5-62b1-477e-a856-1997183fb66b" containerName="extract" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116158 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbb78d1d-21dc-4b1a-8eeb-e71b45d92861" containerName="extract" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 
18:45:18.116166 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7aa9079f-3ff2-48a6-b5ea-ce05a85873c6" containerName="extract" Apr 22 18:45:18.116279 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.116174 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f26530c0-159a-4f80-b8f8-2e7f87601759" containerName="extract" Apr 22 18:45:18.143406 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.143381 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg"] Apr 22 18:45:18.143555 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.143491 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg" Apr 22 18:45:18.146070 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.146047 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:45:18.146201 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.146084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:45:18.146201 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.146140 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-nfvcs\"" Apr 22 18:45:18.193972 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.193943 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c79gx\" (UniqueName: \"kubernetes.io/projected/5e57758f-365e-4543-882d-4a85198dc967-kube-api-access-c79gx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-nfsvg\" (UID: \"5e57758f-365e-4543-882d-4a85198dc967\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg" Apr 22 18:45:18.294702 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:45:18.294672 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c79gx\" (UniqueName: \"kubernetes.io/projected/5e57758f-365e-4543-882d-4a85198dc967-kube-api-access-c79gx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-nfsvg\" (UID: \"5e57758f-365e-4543-882d-4a85198dc967\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg"
Apr 22 18:45:18.303249 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.303227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c79gx\" (UniqueName: \"kubernetes.io/projected/5e57758f-365e-4543-882d-4a85198dc967-kube-api-access-c79gx\") pod \"limitador-operator-controller-manager-c7fb4c8d5-nfsvg\" (UID: \"5e57758f-365e-4543-882d-4a85198dc967\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg"
Apr 22 18:45:18.454358 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.454328 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg"
Apr 22 18:45:18.587901 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.587831 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg"]
Apr 22 18:45:18.590642 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:45:18.590612 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e57758f_365e_4543_882d_4a85198dc967.slice/crio-a8895218ab1d665bd897ab34445f77cb821511c2421f34d72ef706daad759850 WatchSource:0}: Error finding container a8895218ab1d665bd897ab34445f77cb821511c2421f34d72ef706daad759850: Status 404 returned error can't find the container with id a8895218ab1d665bd897ab34445f77cb821511c2421f34d72ef706daad759850
Apr 22 18:45:18.618355 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:18.618320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg" event={"ID":"5e57758f-365e-4543-882d-4a85198dc967","Type":"ContainerStarted","Data":"a8895218ab1d665bd897ab34445f77cb821511c2421f34d72ef706daad759850"}
Apr 22 18:45:22.648728 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:22.648686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg" event={"ID":"5e57758f-365e-4543-882d-4a85198dc967","Type":"ContainerStarted","Data":"31f4687188331777f07644a2d8085f522b3b02276adb5d0ef05400a0ff0bcdee"}
Apr 22 18:45:22.649117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:22.648883 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg"
Apr 22 18:45:22.669857 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:22.669809 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg" podStartSLOduration=1.328673625 podStartE2EDuration="4.669796231s" podCreationTimestamp="2026-04-22 18:45:18 +0000 UTC" firstStartedPulling="2026-04-22 18:45:18.592613927 +0000 UTC m=+454.231139028" lastFinishedPulling="2026-04-22 18:45:21.933736525 +0000 UTC m=+457.572261634" observedRunningTime="2026-04-22 18:45:22.667797109 +0000 UTC m=+458.306322229" watchObservedRunningTime="2026-04-22 18:45:22.669796231 +0000 UTC m=+458.308321352"
Apr 22 18:45:26.350135 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.350096 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-bfb4t"]
Apr 22 18:45:26.352566 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.352549 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t"
Apr 22 18:45:26.354691 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.354661 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-nbqgm\""
Apr 22 18:45:26.364152 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.364127 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-bfb4t"]
Apr 22 18:45:26.463997 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.463966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hrb8\" (UniqueName: \"kubernetes.io/projected/f12f3369-dec0-4542-bbaa-83a886a9fb9f-kube-api-access-5hrb8\") pod \"authorino-operator-7587b89b76-bfb4t\" (UID: \"f12f3369-dec0-4542-bbaa-83a886a9fb9f\") " pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t"
Apr 22 18:45:26.565593 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.565552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hrb8\" (UniqueName: \"kubernetes.io/projected/f12f3369-dec0-4542-bbaa-83a886a9fb9f-kube-api-access-5hrb8\") pod \"authorino-operator-7587b89b76-bfb4t\" (UID: \"f12f3369-dec0-4542-bbaa-83a886a9fb9f\") " pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t"
Apr 22 18:45:26.574111 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.574088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hrb8\" (UniqueName: \"kubernetes.io/projected/f12f3369-dec0-4542-bbaa-83a886a9fb9f-kube-api-access-5hrb8\") pod \"authorino-operator-7587b89b76-bfb4t\" (UID: \"f12f3369-dec0-4542-bbaa-83a886a9fb9f\") " pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t"
Apr 22 18:45:26.664086 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.664049 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t"
Apr 22 18:45:26.799334 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:26.799308 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-bfb4t"]
Apr 22 18:45:26.801202 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:45:26.801172 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf12f3369_dec0_4542_bbaa_83a886a9fb9f.slice/crio-d8100dfce93a020eb4d2ee2c7d62e128cfaf19c8769caaea1adad13bb2064387 WatchSource:0}: Error finding container d8100dfce93a020eb4d2ee2c7d62e128cfaf19c8769caaea1adad13bb2064387: Status 404 returned error can't find the container with id d8100dfce93a020eb4d2ee2c7d62e128cfaf19c8769caaea1adad13bb2064387
Apr 22 18:45:27.670989 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:27.670959 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t" event={"ID":"f12f3369-dec0-4542-bbaa-83a886a9fb9f","Type":"ContainerStarted","Data":"d8100dfce93a020eb4d2ee2c7d62e128cfaf19c8769caaea1adad13bb2064387"}
Apr 22 18:45:28.675976 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:28.675939 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t" event={"ID":"f12f3369-dec0-4542-bbaa-83a886a9fb9f","Type":"ContainerStarted","Data":"2bdb2ed9cebcbcfbed29c96836d44249615764781adc672c0511134516019969"}
Apr 22 18:45:28.676403 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:28.676052 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t"
Apr 22 18:45:28.694294 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:28.694236 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t" podStartSLOduration=1.37359273 podStartE2EDuration="2.694219453s" podCreationTimestamp="2026-04-22 18:45:26 +0000 UTC" firstStartedPulling="2026-04-22 18:45:26.803306524 +0000 UTC m=+462.441831624" lastFinishedPulling="2026-04-22 18:45:28.123933234 +0000 UTC m=+463.762458347" observedRunningTime="2026-04-22 18:45:28.691701116 +0000 UTC m=+464.330226228" watchObservedRunningTime="2026-04-22 18:45:28.694219453 +0000 UTC m=+464.332744576"
Apr 22 18:45:33.656112 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:33.656081 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-nfsvg"
Apr 22 18:45:39.681493 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:45:39.681460 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-bfb4t"
Apr 22 18:46:51.544846 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.544769 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"]
Apr 22 18:46:51.548891 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.548870 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.561119 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.561099 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"]
Apr 22 18:46:51.679416 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.679383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.679603 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.679425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmk5w\" (UniqueName: \"kubernetes.io/projected/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-kube-api-access-qmk5w\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.679603 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.679460 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.679603 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.679532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.679603 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.679565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.679783 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.679609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.679783 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.679652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.781033 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.780997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.781033 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.781038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.781240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.781101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.781240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.781132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.781240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.781168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.781240 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.781201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmk5w\" (UniqueName: \"kubernetes.io/projected/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-kube-api-access-qmk5w\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.781417 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.781252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.782073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.782040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.783670 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.783643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.783781 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.783680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.783781 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.783692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.783965 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.783946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.789884 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.789861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmk5w\" (UniqueName: \"kubernetes.io/projected/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-kube-api-access-qmk5w\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.790295 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.790275 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a2ced0b6-019b-444f-b30c-8c2a55bbe5de-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-gq2nj\" (UID: \"a2ced0b6-019b-444f-b30c-8c2a55bbe5de\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:51.860441 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:51.860348 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:52.208243 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:52.208221 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"]
Apr 22 18:46:52.210306 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:46:52.210279 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2ced0b6_019b_444f_b30c_8c2a55bbe5de.slice/crio-3020ebb149df6eb923d75c8f97149c0d0485c6544c899b89e836be83c933608e WatchSource:0}: Error finding container 3020ebb149df6eb923d75c8f97149c0d0485c6544c899b89e836be83c933608e: Status 404 returned error can't find the container with id 3020ebb149df6eb923d75c8f97149c0d0485c6544c899b89e836be83c933608e
Apr 22 18:46:52.212737 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:52.212682 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 18:46:52.212840 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:52.212766 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 18:46:52.997427 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:52.997393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj" event={"ID":"a2ced0b6-019b-444f-b30c-8c2a55bbe5de","Type":"ContainerStarted","Data":"dbb6af905e27a6f8100cd387ab96c81e1eba81580201dc5736fcde9fb098d1c9"}
Apr 22 18:46:52.997427 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:52.997430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj" event={"ID":"a2ced0b6-019b-444f-b30c-8c2a55bbe5de","Type":"ContainerStarted","Data":"3020ebb149df6eb923d75c8f97149c0d0485c6544c899b89e836be83c933608e"}
Apr 22 18:46:52.997953 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:52.997532 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:53.020864 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:53.020799 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj" podStartSLOduration=2.020781766 podStartE2EDuration="2.020781766s" podCreationTimestamp="2026-04-22 18:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:53.018086093 +0000 UTC m=+548.656611208" watchObservedRunningTime="2026-04-22 18:46:53.020781766 +0000 UTC m=+548.659306889"
Apr 22 18:46:54.003026 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.002972 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-gq2nj"
Apr 22 18:46:54.057665 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.057633 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr"]
Apr 22 18:46:54.057898 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.057878 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" podUID="e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" containerName="discovery" containerID="cri-o://74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd" gracePeriod=30
Apr 22 18:46:54.303124 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.303104 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr"
Apr 22 18:46:54.408369 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.408341 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-dns-cert\") pod \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") "
Apr 22 18:46:54.408369 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.408374 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-local-certs\") pod \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") "
Apr 22 18:46:54.408600 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.408411 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-cacerts\") pod \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") "
Apr 22 18:46:54.408600 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.408437 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-token\") pod \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") "
Apr 22 18:46:54.408600 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.408473 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dq4q\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-kube-api-access-9dq4q\") pod \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") "
Apr 22 18:46:54.408600 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.408552 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-ca-configmap\") pod \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") "
Apr 22 18:46:54.408834 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.408703 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-kubeconfig\") pod \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\" (UID: \"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc\") "
Apr 22 18:46:54.408958 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.408931 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" (UID: "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:46:54.409396 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.409372 2577 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-ca-configmap\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:46:54.411018 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.410981 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-kube-api-access-9dq4q" (OuterVolumeSpecName: "kube-api-access-9dq4q") pod "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" (UID: "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc"). InnerVolumeSpecName "kube-api-access-9dq4q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:46:54.411125 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.411058 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" (UID: "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:46:54.411230 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.411202 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-token" (OuterVolumeSpecName: "istio-token") pod "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" (UID: "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:46:54.411333 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.411247 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-cacerts" (OuterVolumeSpecName: "cacerts") pod "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" (UID: "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:46:54.411333 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.411249 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-local-certs" (OuterVolumeSpecName: "local-certs") pod "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" (UID: "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:46:54.411448 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.411430 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" (UID: "e0b8e9e1-cfdb-4277-83c6-442ca4e761dc"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:46:54.509976 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.509940 2577 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-csr-dns-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:46:54.509976 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.509969 2577 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-local-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:46:54.509976 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.509979 2577 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-cacerts\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:46:54.510205 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.509987 2577 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-token\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:46:54.510205 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.509995 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9dq4q\" (UniqueName: \"kubernetes.io/projected/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-kube-api-access-9dq4q\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:46:54.510205 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:54.510005 2577 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc-istio-kubeconfig\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:46:55.005278 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.005240 2577 generic.go:358] "Generic (PLEG): container finished" podID="e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" containerID="74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd" exitCode=0
Apr 22 18:46:55.005755 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.005295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" event={"ID":"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc","Type":"ContainerDied","Data":"74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd"}
Apr 22 18:46:55.005755 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.005311 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr"
Apr 22 18:46:55.005755 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.005341 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr" event={"ID":"e0b8e9e1-cfdb-4277-83c6-442ca4e761dc","Type":"ContainerDied","Data":"02863ee48e1c1661ea77023b32d9a56f3e01c8be36340b1d65c3849e5ef093f4"}
Apr 22 18:46:55.005755 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.005370 2577 scope.go:117] "RemoveContainer" containerID="74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd"
Apr 22 18:46:55.017704 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.017686 2577 scope.go:117] "RemoveContainer" containerID="74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd"
Apr 22 18:46:55.017974 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:46:55.017955 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd\": container with ID starting with 74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd not found: ID does not exist" containerID="74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd"
Apr 22 18:46:55.018039 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.017981 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd"} err="failed to get container status \"74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd\": rpc error: code = NotFound desc = could not find container \"74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd\": container with ID starting with 74f4da6d3429bdc3b1bafe00060d791459dc034b31fe74e47567a64d463c10fd not found: ID does not exist"
Apr 22 18:46:55.026542 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.026508 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr"]
Apr 22 18:46:55.038181 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:55.038155 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-2k4tr"]
Apr 22 18:46:56.885647 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:56.885617 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" path="/var/lib/kubelet/pods/e0b8e9e1-cfdb-4277-83c6-442ca4e761dc/volumes"
Apr 22 18:46:59.986363 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.986329 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qjx4r"]
Apr 22 18:46:59.986709 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.986687 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" containerName="discovery"
Apr 22 18:46:59.986709 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.986699 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" containerName="discovery"
Apr 22 18:46:59.986799 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.986787 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0b8e9e1-cfdb-4277-83c6-442ca4e761dc" containerName="discovery"
Apr 22 18:46:59.988568 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.988550 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r"
Apr 22 18:46:59.990878 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.990859 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 18:46:59.991871 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.991851 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-cmnmc\""
Apr 22 18:46:59.991987 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.991894 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 22 18:46:59.991987 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.991918 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 18:46:59.997889 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:46:59.997869 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qjx4r"]
Apr 22 18:47:00.033365 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.033340 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-l9gzq"]
Apr 22 18:47:00.035644 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.035615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-l9gzq"
Apr 22 18:47:00.038238 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.038216 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 18:47:00.038891 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.038870 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-p9zql\""
Apr 22 18:47:00.045257 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.045235 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-l9gzq"]
Apr 22 18:47:00.066190 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.066160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfql\" (UniqueName: \"kubernetes.io/projected/94c63f24-7b2d-44e8-82a4-a252351436c5-kube-api-access-5dfql\") pod \"kserve-controller-manager-d9c56dd68-qjx4r\" (UID: \"94c63f24-7b2d-44e8-82a4-a252351436c5\") " pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r"
Apr 22 18:47:00.066285 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.066200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cee16190-fc1d-4bc1-a275-a6c0fe5e8563-data\") pod \"seaweedfs-86cc847c5c-l9gzq\" (UID: \"cee16190-fc1d-4bc1-a275-a6c0fe5e8563\") " pod="kserve/seaweedfs-86cc847c5c-l9gzq"
Apr 22 18:47:00.066285 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.066227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxfnr\" (UniqueName: \"kubernetes.io/projected/cee16190-fc1d-4bc1-a275-a6c0fe5e8563-kube-api-access-cxfnr\") pod \"seaweedfs-86cc847c5c-l9gzq\" (UID: \"cee16190-fc1d-4bc1-a275-a6c0fe5e8563\") " pod="kserve/seaweedfs-86cc847c5c-l9gzq"
Apr 22 18:47:00.066361 ip-10-0-139-10 kubenswrapper[2577]: I0422
18:47:00.066286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94c63f24-7b2d-44e8-82a4-a252351436c5-cert\") pod \"kserve-controller-manager-d9c56dd68-qjx4r\" (UID: \"94c63f24-7b2d-44e8-82a4-a252351436c5\") " pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:00.167645 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.167612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfql\" (UniqueName: \"kubernetes.io/projected/94c63f24-7b2d-44e8-82a4-a252351436c5-kube-api-access-5dfql\") pod \"kserve-controller-manager-d9c56dd68-qjx4r\" (UID: \"94c63f24-7b2d-44e8-82a4-a252351436c5\") " pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:00.167645 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.167646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cee16190-fc1d-4bc1-a275-a6c0fe5e8563-data\") pod \"seaweedfs-86cc847c5c-l9gzq\" (UID: \"cee16190-fc1d-4bc1-a275-a6c0fe5e8563\") " pod="kserve/seaweedfs-86cc847c5c-l9gzq" Apr 22 18:47:00.167870 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.167663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxfnr\" (UniqueName: \"kubernetes.io/projected/cee16190-fc1d-4bc1-a275-a6c0fe5e8563-kube-api-access-cxfnr\") pod \"seaweedfs-86cc847c5c-l9gzq\" (UID: \"cee16190-fc1d-4bc1-a275-a6c0fe5e8563\") " pod="kserve/seaweedfs-86cc847c5c-l9gzq" Apr 22 18:47:00.167870 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.167691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94c63f24-7b2d-44e8-82a4-a252351436c5-cert\") pod \"kserve-controller-manager-d9c56dd68-qjx4r\" (UID: \"94c63f24-7b2d-44e8-82a4-a252351436c5\") " 
pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:00.168052 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.168031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cee16190-fc1d-4bc1-a275-a6c0fe5e8563-data\") pod \"seaweedfs-86cc847c5c-l9gzq\" (UID: \"cee16190-fc1d-4bc1-a275-a6c0fe5e8563\") " pod="kserve/seaweedfs-86cc847c5c-l9gzq" Apr 22 18:47:00.170106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.170080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94c63f24-7b2d-44e8-82a4-a252351436c5-cert\") pod \"kserve-controller-manager-d9c56dd68-qjx4r\" (UID: \"94c63f24-7b2d-44e8-82a4-a252351436c5\") " pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:00.176544 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.176524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxfnr\" (UniqueName: \"kubernetes.io/projected/cee16190-fc1d-4bc1-a275-a6c0fe5e8563-kube-api-access-cxfnr\") pod \"seaweedfs-86cc847c5c-l9gzq\" (UID: \"cee16190-fc1d-4bc1-a275-a6c0fe5e8563\") " pod="kserve/seaweedfs-86cc847c5c-l9gzq" Apr 22 18:47:00.176780 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.176612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dfql\" (UniqueName: \"kubernetes.io/projected/94c63f24-7b2d-44e8-82a4-a252351436c5-kube-api-access-5dfql\") pod \"kserve-controller-manager-d9c56dd68-qjx4r\" (UID: \"94c63f24-7b2d-44e8-82a4-a252351436c5\") " pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:00.299867 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.299782 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:00.346930 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.346327 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-l9gzq" Apr 22 18:47:00.430611 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.430585 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qjx4r"] Apr 22 18:47:00.431413 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:47:00.431374 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94c63f24_7b2d_44e8_82a4_a252351436c5.slice/crio-9b6bf867ec49cb9366aaa7fc8ce12afbd87da0ad7f6060bd46f3d296ca557a28 WatchSource:0}: Error finding container 9b6bf867ec49cb9366aaa7fc8ce12afbd87da0ad7f6060bd46f3d296ca557a28: Status 404 returned error can't find the container with id 9b6bf867ec49cb9366aaa7fc8ce12afbd87da0ad7f6060bd46f3d296ca557a28 Apr 22 18:47:00.482505 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:00.482483 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-l9gzq"] Apr 22 18:47:00.483988 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:47:00.483969 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee16190_fc1d_4bc1_a275_a6c0fe5e8563.slice/crio-07cf9f04f62f88a0b20772d032c8d138dccfba03128573a52c94456ac995ebd6 WatchSource:0}: Error finding container 07cf9f04f62f88a0b20772d032c8d138dccfba03128573a52c94456ac995ebd6: Status 404 returned error can't find the container with id 07cf9f04f62f88a0b20772d032c8d138dccfba03128573a52c94456ac995ebd6 Apr 22 18:47:01.030868 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:01.030829 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" 
event={"ID":"94c63f24-7b2d-44e8-82a4-a252351436c5","Type":"ContainerStarted","Data":"9b6bf867ec49cb9366aaa7fc8ce12afbd87da0ad7f6060bd46f3d296ca557a28"} Apr 22 18:47:01.032172 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:01.032140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-l9gzq" event={"ID":"cee16190-fc1d-4bc1-a275-a6c0fe5e8563","Type":"ContainerStarted","Data":"07cf9f04f62f88a0b20772d032c8d138dccfba03128573a52c94456ac995ebd6"} Apr 22 18:47:05.053759 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:05.053698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-l9gzq" event={"ID":"cee16190-fc1d-4bc1-a275-a6c0fe5e8563","Type":"ContainerStarted","Data":"9112f8c2bf820a8cf808e7bef2319cbe9b8a0139f94d01b12ceb0387caf93d91"} Apr 22 18:47:05.054194 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:05.053769 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-l9gzq" Apr 22 18:47:05.055020 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:05.054995 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" event={"ID":"94c63f24-7b2d-44e8-82a4-a252351436c5","Type":"ContainerStarted","Data":"4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af"} Apr 22 18:47:05.055119 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:05.055040 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:05.071874 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:05.071831 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-l9gzq" podStartSLOduration=1.328611213 podStartE2EDuration="5.071818282s" podCreationTimestamp="2026-04-22 18:47:00 +0000 UTC" firstStartedPulling="2026-04-22 18:47:00.485256739 +0000 UTC m=+556.123781838" lastFinishedPulling="2026-04-22 
18:47:04.228463804 +0000 UTC m=+559.866988907" observedRunningTime="2026-04-22 18:47:05.069287505 +0000 UTC m=+560.707812626" watchObservedRunningTime="2026-04-22 18:47:05.071818282 +0000 UTC m=+560.710343407" Apr 22 18:47:05.085031 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:05.084985 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" podStartSLOduration=2.388213219 podStartE2EDuration="6.084974176s" podCreationTimestamp="2026-04-22 18:46:59 +0000 UTC" firstStartedPulling="2026-04-22 18:47:00.432828324 +0000 UTC m=+556.071353424" lastFinishedPulling="2026-04-22 18:47:04.129589271 +0000 UTC m=+559.768114381" observedRunningTime="2026-04-22 18:47:05.084210661 +0000 UTC m=+560.722735786" watchObservedRunningTime="2026-04-22 18:47:05.084974176 +0000 UTC m=+560.723499329" Apr 22 18:47:11.061584 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:11.061551 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-l9gzq" Apr 22 18:47:36.064615 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:36.064584 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:36.832787 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:36.832752 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qjx4r"] Apr 22 18:47:36.832969 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:36.832947 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" podUID="94c63f24-7b2d-44e8-82a4-a252351436c5" containerName="manager" containerID="cri-o://4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af" gracePeriod=10 Apr 22 18:47:36.855925 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:36.855900 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/kserve-controller-manager-d9c56dd68-5sq2t"] Apr 22 18:47:36.859424 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:36.859409 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:36.867806 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:36.867784 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-5sq2t"] Apr 22 18:47:36.993484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:36.993439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce40ac3a-16fb-4e00-8be5-c55e202bd2d4-cert\") pod \"kserve-controller-manager-d9c56dd68-5sq2t\" (UID: \"ce40ac3a-16fb-4e00-8be5-c55e202bd2d4\") " pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:36.993666 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:36.993495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp875\" (UniqueName: \"kubernetes.io/projected/ce40ac3a-16fb-4e00-8be5-c55e202bd2d4-kube-api-access-jp875\") pod \"kserve-controller-manager-d9c56dd68-5sq2t\" (UID: \"ce40ac3a-16fb-4e00-8be5-c55e202bd2d4\") " pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:37.073414 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.073394 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:37.094085 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.094019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce40ac3a-16fb-4e00-8be5-c55e202bd2d4-cert\") pod \"kserve-controller-manager-d9c56dd68-5sq2t\" (UID: \"ce40ac3a-16fb-4e00-8be5-c55e202bd2d4\") " pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:37.094085 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.094074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp875\" (UniqueName: \"kubernetes.io/projected/ce40ac3a-16fb-4e00-8be5-c55e202bd2d4-kube-api-access-jp875\") pod \"kserve-controller-manager-d9c56dd68-5sq2t\" (UID: \"ce40ac3a-16fb-4e00-8be5-c55e202bd2d4\") " pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:37.102231 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.102208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp875\" (UniqueName: \"kubernetes.io/projected/ce40ac3a-16fb-4e00-8be5-c55e202bd2d4-kube-api-access-jp875\") pod \"kserve-controller-manager-d9c56dd68-5sq2t\" (UID: \"ce40ac3a-16fb-4e00-8be5-c55e202bd2d4\") " pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:37.103426 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.103405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ce40ac3a-16fb-4e00-8be5-c55e202bd2d4-cert\") pod \"kserve-controller-manager-d9c56dd68-5sq2t\" (UID: \"ce40ac3a-16fb-4e00-8be5-c55e202bd2d4\") " pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:37.177780 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.177749 2577 generic.go:358] "Generic (PLEG): container finished" podID="94c63f24-7b2d-44e8-82a4-a252351436c5" 
containerID="4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af" exitCode=0 Apr 22 18:47:37.177931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.177813 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" Apr 22 18:47:37.177931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.177844 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" event={"ID":"94c63f24-7b2d-44e8-82a4-a252351436c5","Type":"ContainerDied","Data":"4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af"} Apr 22 18:47:37.177931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.177885 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-qjx4r" event={"ID":"94c63f24-7b2d-44e8-82a4-a252351436c5","Type":"ContainerDied","Data":"9b6bf867ec49cb9366aaa7fc8ce12afbd87da0ad7f6060bd46f3d296ca557a28"} Apr 22 18:47:37.177931 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.177903 2577 scope.go:117] "RemoveContainer" containerID="4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af" Apr 22 18:47:37.188003 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.187969 2577 scope.go:117] "RemoveContainer" containerID="4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af" Apr 22 18:47:37.188444 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:47:37.188422 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af\": container with ID starting with 4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af not found: ID does not exist" containerID="4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af" Apr 22 18:47:37.188518 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.188455 2577 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af"} err="failed to get container status \"4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af\": rpc error: code = NotFound desc = could not find container \"4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af\": container with ID starting with 4a22860c4678db1cb2dba26fd1a44bacff1895c95960e4d996ebf2f6309927af not found: ID does not exist" Apr 22 18:47:37.195236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.195216 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94c63f24-7b2d-44e8-82a4-a252351436c5-cert\") pod \"94c63f24-7b2d-44e8-82a4-a252351436c5\" (UID: \"94c63f24-7b2d-44e8-82a4-a252351436c5\") " Apr 22 18:47:37.195320 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.195286 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dfql\" (UniqueName: \"kubernetes.io/projected/94c63f24-7b2d-44e8-82a4-a252351436c5-kube-api-access-5dfql\") pod \"94c63f24-7b2d-44e8-82a4-a252351436c5\" (UID: \"94c63f24-7b2d-44e8-82a4-a252351436c5\") " Apr 22 18:47:37.197133 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.197108 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c63f24-7b2d-44e8-82a4-a252351436c5-cert" (OuterVolumeSpecName: "cert") pod "94c63f24-7b2d-44e8-82a4-a252351436c5" (UID: "94c63f24-7b2d-44e8-82a4-a252351436c5"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:37.197217 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.197171 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c63f24-7b2d-44e8-82a4-a252351436c5-kube-api-access-5dfql" (OuterVolumeSpecName: "kube-api-access-5dfql") pod "94c63f24-7b2d-44e8-82a4-a252351436c5" (UID: "94c63f24-7b2d-44e8-82a4-a252351436c5"). InnerVolumeSpecName "kube-api-access-5dfql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:37.220511 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.220479 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:37.296485 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.296449 2577 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94c63f24-7b2d-44e8-82a4-a252351436c5-cert\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:47:37.296629 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.296490 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dfql\" (UniqueName: \"kubernetes.io/projected/94c63f24-7b2d-44e8-82a4-a252351436c5-kube-api-access-5dfql\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:47:37.339356 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.339332 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-5sq2t"] Apr 22 18:47:37.341398 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:47:37.341373 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce40ac3a_16fb_4e00_8be5_c55e202bd2d4.slice/crio-567de06c2ddb33f392376a201190d86adacee027bd181d1b1d0c1a4253823997 WatchSource:0}: Error finding container 567de06c2ddb33f392376a201190d86adacee027bd181d1b1d0c1a4253823997: Status 404 
returned error can't find the container with id 567de06c2ddb33f392376a201190d86adacee027bd181d1b1d0c1a4253823997 Apr 22 18:47:37.499783 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.499757 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qjx4r"] Apr 22 18:47:37.503205 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:37.503181 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qjx4r"] Apr 22 18:47:38.184207 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:38.184176 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" event={"ID":"ce40ac3a-16fb-4e00-8be5-c55e202bd2d4","Type":"ContainerStarted","Data":"9b7cc8e7076b5cf096eefbf356a3b4bea85302036ef758bba129ddbd4d47e0c4"} Apr 22 18:47:38.184207 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:38.184209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" event={"ID":"ce40ac3a-16fb-4e00-8be5-c55e202bd2d4","Type":"ContainerStarted","Data":"567de06c2ddb33f392376a201190d86adacee027bd181d1b1d0c1a4253823997"} Apr 22 18:47:38.184628 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:38.184305 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:47:38.200612 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:38.200571 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" podStartSLOduration=1.728428501 podStartE2EDuration="2.20055803s" podCreationTimestamp="2026-04-22 18:47:36 +0000 UTC" firstStartedPulling="2026-04-22 18:47:37.342627121 +0000 UTC m=+592.981152222" lastFinishedPulling="2026-04-22 18:47:37.814756641 +0000 UTC m=+593.453281751" observedRunningTime="2026-04-22 18:47:38.19944764 +0000 UTC m=+593.837972760" watchObservedRunningTime="2026-04-22 
18:47:38.20055803 +0000 UTC m=+593.839083152" Apr 22 18:47:38.884342 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:47:38.884309 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c63f24-7b2d-44e8-82a4-a252351436c5" path="/var/lib/kubelet/pods/94c63f24-7b2d-44e8-82a4-a252351436c5/volumes" Apr 22 18:48:09.195392 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:09.195313 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-d9c56dd68-5sq2t" Apr 22 18:48:10.120952 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.120918 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-hn5dl"] Apr 22 18:48:10.121342 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.121326 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94c63f24-7b2d-44e8-82a4-a252351436c5" containerName="manager" Apr 22 18:48:10.121392 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.121344 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c63f24-7b2d-44e8-82a4-a252351436c5" containerName="manager" Apr 22 18:48:10.121427 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.121409 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="94c63f24-7b2d-44e8-82a4-a252351436c5" containerName="manager" Apr 22 18:48:10.124418 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.124398 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:10.126675 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.126650 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:48:10.126825 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.126655 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-v8tqp\"" Apr 22 18:48:10.133106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.133081 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-hn5dl"] Apr 22 18:48:10.163617 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.163589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8zd7\" (UniqueName: \"kubernetes.io/projected/6cc591d9-7303-4604-8c02-f07fa71d9e40-kube-api-access-w8zd7\") pod \"odh-model-controller-696fc77849-hn5dl\" (UID: \"6cc591d9-7303-4604-8c02-f07fa71d9e40\") " pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:10.163772 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.163637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cc591d9-7303-4604-8c02-f07fa71d9e40-cert\") pod \"odh-model-controller-696fc77849-hn5dl\" (UID: \"6cc591d9-7303-4604-8c02-f07fa71d9e40\") " pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:10.264690 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.264656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cc591d9-7303-4604-8c02-f07fa71d9e40-cert\") pod \"odh-model-controller-696fc77849-hn5dl\" (UID: \"6cc591d9-7303-4604-8c02-f07fa71d9e40\") " pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:10.265088 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:48:10.264806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8zd7\" (UniqueName: \"kubernetes.io/projected/6cc591d9-7303-4604-8c02-f07fa71d9e40-kube-api-access-w8zd7\") pod \"odh-model-controller-696fc77849-hn5dl\" (UID: \"6cc591d9-7303-4604-8c02-f07fa71d9e40\") " pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:10.265088 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:48:10.264836 2577 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:48:10.265088 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:48:10.264904 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cc591d9-7303-4604-8c02-f07fa71d9e40-cert podName:6cc591d9-7303-4604-8c02-f07fa71d9e40 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:10.76488348 +0000 UTC m=+626.403408601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6cc591d9-7303-4604-8c02-f07fa71d9e40-cert") pod "odh-model-controller-696fc77849-hn5dl" (UID: "6cc591d9-7303-4604-8c02-f07fa71d9e40") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:48:10.273249 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.273225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8zd7\" (UniqueName: \"kubernetes.io/projected/6cc591d9-7303-4604-8c02-f07fa71d9e40-kube-api-access-w8zd7\") pod \"odh-model-controller-696fc77849-hn5dl\" (UID: \"6cc591d9-7303-4604-8c02-f07fa71d9e40\") " pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:10.769285 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.769244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cc591d9-7303-4604-8c02-f07fa71d9e40-cert\") pod \"odh-model-controller-696fc77849-hn5dl\" 
(UID: \"6cc591d9-7303-4604-8c02-f07fa71d9e40\") " pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:10.771636 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:10.771613 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cc591d9-7303-4604-8c02-f07fa71d9e40-cert\") pod \"odh-model-controller-696fc77849-hn5dl\" (UID: \"6cc591d9-7303-4604-8c02-f07fa71d9e40\") " pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:11.034892 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:11.034812 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:11.164464 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:11.164415 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-hn5dl"] Apr 22 18:48:11.313182 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:11.313096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-hn5dl" event={"ID":"6cc591d9-7303-4604-8c02-f07fa71d9e40","Type":"ContainerStarted","Data":"dbc5fc13b985672ecf398dee62b4b2dfb49c4eefa76706095a2c77edd01666f9"} Apr 22 18:48:14.331272 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:14.331233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-hn5dl" event={"ID":"6cc591d9-7303-4604-8c02-f07fa71d9e40","Type":"ContainerStarted","Data":"c886da72fb0976a152be1f82cb201db086995789504b2acefddd6312c3938b30"} Apr 22 18:48:14.331893 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:14.331360 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:48:14.348505 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:14.348452 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-hn5dl" 
podStartSLOduration=1.738944299 podStartE2EDuration="4.348434505s" podCreationTimestamp="2026-04-22 18:48:10 +0000 UTC" firstStartedPulling="2026-04-22 18:48:11.169949588 +0000 UTC m=+626.808474689" lastFinishedPulling="2026-04-22 18:48:13.779439778 +0000 UTC m=+629.417964895" observedRunningTime="2026-04-22 18:48:14.347853338 +0000 UTC m=+629.986378460" watchObservedRunningTime="2026-04-22 18:48:14.348434505 +0000 UTC m=+629.986959628" Apr 22 18:48:25.337368 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:48:25.337331 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-hn5dl" Apr 22 18:49:04.234676 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.234634 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s"] Apr 22 18:49:04.237282 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.237265 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.240605 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.240576 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 18:49:04.240764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.240631 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:49:04.240764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.240578 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-ksj8g\"" Apr 22 18:49:04.240764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.240584 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:49:04.240764 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.240589 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 18:49:04.251073 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.251049 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s"] Apr 22 18:49:04.354748 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.354695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 
18:49:04.354911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.354774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.354911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.354811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.354911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.354844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.355042 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.355013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ec42ac4-29c4-4823-ba82-df299444b76e-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" 
Apr 22 18:49:04.355100 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.355076 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxmbq\" (UniqueName: \"kubernetes.io/projected/9ec42ac4-29c4-4823-ba82-df299444b76e-kube-api-access-pxmbq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456316 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456316 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 
22 18:49:04.456316 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456316 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ec42ac4-29c4-4823-ba82-df299444b76e-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456577 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxmbq\" (UniqueName: \"kubernetes.io/projected/9ec42ac4-29c4-4823-ba82-df299444b76e-kube-api-access-pxmbq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456577 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456687 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:49:04.456583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.456687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.456666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.458803 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.458785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ec42ac4-29c4-4823-ba82-df299444b76e-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.464762 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.464738 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pxmbq\" (UniqueName: \"kubernetes.io/projected/9ec42ac4-29c4-4823-ba82-df299444b76e-kube-api-access-pxmbq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.550941 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.550865 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:04.674224 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.674196 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s"] Apr 22 18:49:04.675660 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:49:04.675633 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec42ac4_29c4_4823_ba82_df299444b76e.slice/crio-e9fad7424d8a143e22c1fe8b22f20ed89ce8e51dc4ec366b46ef364f2ba10cca WatchSource:0}: Error finding container e9fad7424d8a143e22c1fe8b22f20ed89ce8e51dc4ec366b46ef364f2ba10cca: Status 404 returned error can't find the container with id e9fad7424d8a143e22c1fe8b22f20ed89ce8e51dc4ec366b46ef364f2ba10cca Apr 22 18:49:04.677701 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:04.677680 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:49:05.552906 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:05.552853 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" event={"ID":"9ec42ac4-29c4-4823-ba82-df299444b76e","Type":"ContainerStarted","Data":"e9fad7424d8a143e22c1fe8b22f20ed89ce8e51dc4ec366b46ef364f2ba10cca"} Apr 22 18:49:08.566756 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:49:08.566651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" event={"ID":"9ec42ac4-29c4-4823-ba82-df299444b76e","Type":"ContainerStarted","Data":"de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d"} Apr 22 18:49:09.574901 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:09.574865 2577 generic.go:358] "Generic (PLEG): container finished" podID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerID="de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d" exitCode=0 Apr 22 18:49:09.575278 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:09.574922 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" event={"ID":"9ec42ac4-29c4-4823-ba82-df299444b76e","Type":"ContainerDied","Data":"de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d"} Apr 22 18:49:10.580442 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:10.580367 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" event={"ID":"9ec42ac4-29c4-4823-ba82-df299444b76e","Type":"ContainerStarted","Data":"a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7"} Apr 22 18:49:40.715599 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:40.715557 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" event={"ID":"9ec42ac4-29c4-4823-ba82-df299444b76e","Type":"ContainerStarted","Data":"2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41"} Apr 22 18:49:40.716158 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:40.715750 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 
18:49:40.718513 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:40.718490 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:40.740355 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:40.740312 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" podStartSLOduration=1.060978089 podStartE2EDuration="36.74029839s" podCreationTimestamp="2026-04-22 18:49:04 +0000 UTC" firstStartedPulling="2026-04-22 18:49:04.677878634 +0000 UTC m=+680.316403738" lastFinishedPulling="2026-04-22 18:49:40.357198934 +0000 UTC m=+715.995724039" observedRunningTime="2026-04-22 18:49:40.738129139 +0000 UTC m=+716.376654261" watchObservedRunningTime="2026-04-22 18:49:40.74029839 +0000 UTC m=+716.378823511" Apr 22 18:49:44.551295 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:44.551261 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:44.551699 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:44.551310 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:54.553447 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:54.553415 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:49:54.554492 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:49:54.554467 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:51:07.209748 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.209636 2577 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g"] Apr 22 18:51:07.212684 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.212663 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.214895 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.214869 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-s4n9h\"" Apr 22 18:51:07.214999 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.214950 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 18:51:07.222509 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.222486 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g"] Apr 22 18:51:07.384972 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.384944 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfc7h\" (UniqueName: \"kubernetes.io/projected/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kube-api-access-xfc7h\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.385134 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.384996 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kserve-provision-location\") pod 
\"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.385134 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.385053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.385134 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.385098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.385246 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.385159 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.385246 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.385181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.486498 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.486420 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfc7h\" (UniqueName: \"kubernetes.io/projected/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kube-api-access-xfc7h\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.486498 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.486492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.486744 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.486513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.486744 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.486534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.486744 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.486574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.486744 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.486602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.487049 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.487023 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.487133 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.487049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.487193 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.487136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.487193 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.487156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.489017 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.488999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.494672 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.494651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfc7h\" (UniqueName: \"kubernetes.io/projected/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kube-api-access-xfc7h\") pod 
\"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.524576 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.524547 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:07.657678 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:07.657653 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g"] Apr 22 18:51:07.660499 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:51:07.660476 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcab36a5b_029d_4278_b31d_b399d0b3a2f9.slice/crio-d885625784f0f3ed3a96ac3988b809b3dfc9af0db70d3ece5f9d37c66c6e4084 WatchSource:0}: Error finding container d885625784f0f3ed3a96ac3988b809b3dfc9af0db70d3ece5f9d37c66c6e4084: Status 404 returned error can't find the container with id d885625784f0f3ed3a96ac3988b809b3dfc9af0db70d3ece5f9d37c66c6e4084 Apr 22 18:51:08.055838 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:08.055793 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" event={"ID":"cab36a5b-029d-4278-b31d-b399d0b3a2f9","Type":"ContainerStarted","Data":"7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494"} Apr 22 18:51:08.055838 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:08.055834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" event={"ID":"cab36a5b-029d-4278-b31d-b399d0b3a2f9","Type":"ContainerStarted","Data":"d885625784f0f3ed3a96ac3988b809b3dfc9af0db70d3ece5f9d37c66c6e4084"} Apr 22 
18:51:09.061102 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:09.061068 2577 generic.go:358] "Generic (PLEG): container finished" podID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerID="7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494" exitCode=0 Apr 22 18:51:09.061478 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:09.061122 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" event={"ID":"cab36a5b-029d-4278-b31d-b399d0b3a2f9","Type":"ContainerDied","Data":"7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494"} Apr 22 18:51:10.067010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:10.066972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" event={"ID":"cab36a5b-029d-4278-b31d-b399d0b3a2f9","Type":"ContainerStarted","Data":"fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348"} Apr 22 18:51:10.067010 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:10.067021 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" event={"ID":"cab36a5b-029d-4278-b31d-b399d0b3a2f9","Type":"ContainerStarted","Data":"0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42"} Apr 22 18:51:10.067427 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:10.067134 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:10.092119 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:10.092070 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" podStartSLOduration=3.092055956 podStartE2EDuration="3.092055956s" podCreationTimestamp="2026-04-22 18:51:07 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:10.088867423 +0000 UTC m=+805.727392546" watchObservedRunningTime="2026-04-22 18:51:10.092055956 +0000 UTC m=+805.730581077" Apr 22 18:51:17.525554 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:17.525520 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:17.526140 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:17.525566 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:17.528314 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:17.528291 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:18.103780 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:18.103749 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:39.107938 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:39.107908 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:51:59.812336 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:59.812300 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s"] Apr 22 18:51:59.812924 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:59.812738 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" 
podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="main" containerID="cri-o://a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7" gracePeriod=30 Apr 22 18:51:59.812924 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:51:59.812824 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="tokenizer" containerID="cri-o://2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41" gracePeriod=30 Apr 22 18:52:00.266844 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:00.266808 2577 generic.go:358] "Generic (PLEG): container finished" podID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerID="a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7" exitCode=0 Apr 22 18:52:00.267012 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:00.266846 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" event={"ID":"9ec42ac4-29c4-4823-ba82-df299444b76e","Type":"ContainerDied","Data":"a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7"} Apr 22 18:52:00.717666 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:52:00.717625 2577 logging.go:55] [core] [Channel #125 SubChannel #126]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.47:9003", ServerName: "10.133.0.47:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.47:9003: connect: connection refused" Apr 22 18:52:00.974786 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:00.974700 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:52:01.073734 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.073684 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-uds\") pod \"9ec42ac4-29c4-4823-ba82-df299444b76e\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " Apr 22 18:52:01.073899 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.073803 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxmbq\" (UniqueName: \"kubernetes.io/projected/9ec42ac4-29c4-4823-ba82-df299444b76e-kube-api-access-pxmbq\") pod \"9ec42ac4-29c4-4823-ba82-df299444b76e\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " Apr 22 18:52:01.073899 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.073842 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-cache\") pod \"9ec42ac4-29c4-4823-ba82-df299444b76e\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " Apr 22 18:52:01.073899 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.073869 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-tmp\") pod \"9ec42ac4-29c4-4823-ba82-df299444b76e\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " Apr 22 18:52:01.074214 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.073898 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-kserve-provision-location\") pod \"9ec42ac4-29c4-4823-ba82-df299444b76e\" (UID: 
\"9ec42ac4-29c4-4823-ba82-df299444b76e\") " Apr 22 18:52:01.074214 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.073924 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ec42ac4-29c4-4823-ba82-df299444b76e-tls-certs\") pod \"9ec42ac4-29c4-4823-ba82-df299444b76e\" (UID: \"9ec42ac4-29c4-4823-ba82-df299444b76e\") " Apr 22 18:52:01.074214 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.073952 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9ec42ac4-29c4-4823-ba82-df299444b76e" (UID: "9ec42ac4-29c4-4823-ba82-df299444b76e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:01.074214 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.074127 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9ec42ac4-29c4-4823-ba82-df299444b76e" (UID: "9ec42ac4-29c4-4823-ba82-df299444b76e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:01.074214 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.074199 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:52:01.074214 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.074207 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9ec42ac4-29c4-4823-ba82-df299444b76e" (UID: "9ec42ac4-29c4-4823-ba82-df299444b76e"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:01.074543 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.074219 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-uds\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:52:01.074879 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.074851 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ec42ac4-29c4-4823-ba82-df299444b76e" (UID: "9ec42ac4-29c4-4823-ba82-df299444b76e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:01.076124 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.076094 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec42ac4-29c4-4823-ba82-df299444b76e-kube-api-access-pxmbq" (OuterVolumeSpecName: "kube-api-access-pxmbq") pod "9ec42ac4-29c4-4823-ba82-df299444b76e" (UID: "9ec42ac4-29c4-4823-ba82-df299444b76e"). InnerVolumeSpecName "kube-api-access-pxmbq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:52:01.076124 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.076116 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ec42ac4-29c4-4823-ba82-df299444b76e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9ec42ac4-29c4-4823-ba82-df299444b76e" (UID: "9ec42ac4-29c4-4823-ba82-df299444b76e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:52:01.174992 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.174953 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxmbq\" (UniqueName: \"kubernetes.io/projected/9ec42ac4-29c4-4823-ba82-df299444b76e-kube-api-access-pxmbq\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:52:01.174992 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.174988 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-tokenizer-tmp\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:52:01.174992 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.174998 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ec42ac4-29c4-4823-ba82-df299444b76e-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:52:01.175215 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.175008 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ec42ac4-29c4-4823-ba82-df299444b76e-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:52:01.272959 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.272880 2577 generic.go:358] "Generic (PLEG): container finished" podID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerID="2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41" exitCode=0 Apr 22 18:52:01.273117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.272969 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" Apr 22 18:52:01.273117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.272966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" event={"ID":"9ec42ac4-29c4-4823-ba82-df299444b76e","Type":"ContainerDied","Data":"2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41"} Apr 22 18:52:01.273117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.273013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" event={"ID":"9ec42ac4-29c4-4823-ba82-df299444b76e","Type":"ContainerDied","Data":"e9fad7424d8a143e22c1fe8b22f20ed89ce8e51dc4ec366b46ef364f2ba10cca"} Apr 22 18:52:01.273117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.273033 2577 scope.go:117] "RemoveContainer" containerID="2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41" Apr 22 18:52:01.282526 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.282511 2577 scope.go:117] "RemoveContainer" containerID="a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7" Apr 22 18:52:01.290423 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.290409 2577 scope.go:117] "RemoveContainer" containerID="de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d" Apr 22 18:52:01.297078 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.297054 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s"] Apr 22 18:52:01.298835 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.298820 2577 scope.go:117] "RemoveContainer" containerID="2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41" Apr 22 18:52:01.299132 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:52:01.299108 2577 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41\": container with ID starting with 2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41 not found: ID does not exist" containerID="2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41" Apr 22 18:52:01.299233 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.299137 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41"} err="failed to get container status \"2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41\": rpc error: code = NotFound desc = could not find container \"2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41\": container with ID starting with 2e785fedeb63f5d16edddaab8c7e366033ad708df576e9fcb1ba0a80fa174d41 not found: ID does not exist" Apr 22 18:52:01.299233 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.299158 2577 scope.go:117] "RemoveContainer" containerID="a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7" Apr 22 18:52:01.299399 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:52:01.299378 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7\": container with ID starting with a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7 not found: ID does not exist" containerID="a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7" Apr 22 18:52:01.299439 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.299404 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7"} err="failed to get container status 
\"a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7\": rpc error: code = NotFound desc = could not find container \"a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7\": container with ID starting with a085c7a87dc7e3331e5ecaafeec396812ca2fcc145d6271ed72a3b92079812a7 not found: ID does not exist" Apr 22 18:52:01.299439 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.299419 2577 scope.go:117] "RemoveContainer" containerID="de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d" Apr 22 18:52:01.299696 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:52:01.299673 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d\": container with ID starting with de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d not found: ID does not exist" containerID="de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d" Apr 22 18:52:01.299807 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.299703 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d"} err="failed to get container status \"de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d\": rpc error: code = NotFound desc = could not find container \"de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d\": container with ID starting with de0739834228bfb33587294081496551810c860fd6a545db5ec4ff7dab4db92d not found: ID does not exist" Apr 22 18:52:01.300866 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.300850 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s"] Apr 22 18:52:01.716967 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:01.716931 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-69754d877s" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.47:9003\" within 1s: context deadline exceeded" Apr 22 18:52:02.885469 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:02.885437 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" path="/var/lib/kubelet/pods/9ec42ac4-29c4-4823-ba82-df299444b76e/volumes" Apr 22 18:52:07.712919 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.712885 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"] Apr 22 18:52:07.713553 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.713508 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="main" Apr 22 18:52:07.713687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.713613 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="main" Apr 22 18:52:07.713687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.713657 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="storage-initializer" Apr 22 18:52:07.713687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.713666 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="storage-initializer" Apr 22 18:52:07.713687 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.713681 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="tokenizer" Apr 22 18:52:07.713936 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.713689 2577 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="tokenizer" Apr 22 18:52:07.713936 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.713811 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="tokenizer" Apr 22 18:52:07.713936 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.713837 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ec42ac4-29c4-4823-ba82-df299444b76e" containerName="main" Apr 22 18:52:07.716853 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.716834 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.719107 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.719088 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-9ld8c\"" Apr 22 18:52:07.719225 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.719152 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 18:52:07.729412 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.729388 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"] Apr 22 18:52:07.841644 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.841607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 
18:52:07.842023 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.842003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.842136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.842089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/78dcce1b-42e4-464f-be96-8b06099ba402-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.842199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.842151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85xz\" (UniqueName: \"kubernetes.io/projected/78dcce1b-42e4-464f-be96-8b06099ba402-kube-api-access-d85xz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.842254 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.842219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.842305 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.842260 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.943393 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.943362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.943550 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.943417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.943550 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.943457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/78dcce1b-42e4-464f-be96-8b06099ba402-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.943550 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.943506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d85xz\" (UniqueName: \"kubernetes.io/projected/78dcce1b-42e4-464f-be96-8b06099ba402-kube-api-access-d85xz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.943550 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.943543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.943802 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.943607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.943885 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.943852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.944005 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.943982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.944139 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.944107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.944139 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.944113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.946445 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.946426 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/78dcce1b-42e4-464f-be96-8b06099ba402-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:07.952309 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:07.952287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85xz\" (UniqueName: \"kubernetes.io/projected/78dcce1b-42e4-464f-be96-8b06099ba402-kube-api-access-d85xz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:08.028800 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:08.028705 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" Apr 22 18:52:08.155811 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:08.155783 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"] Apr 22 18:52:08.157367 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:52:08.157342 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78dcce1b_42e4_464f_be96_8b06099ba402.slice/crio-4df5ecb6740270543ac9251789551710dd2508aed8f81fb9b83d781f69add03e WatchSource:0}: Error finding container 4df5ecb6740270543ac9251789551710dd2508aed8f81fb9b83d781f69add03e: Status 404 returned error can't find the container with id 4df5ecb6740270543ac9251789551710dd2508aed8f81fb9b83d781f69add03e Apr 22 18:52:08.301498 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:08.301406 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" event={"ID":"78dcce1b-42e4-464f-be96-8b06099ba402","Type":"ContainerStarted","Data":"5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be"} Apr 22 
18:52:08.301498 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:08.301443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" event={"ID":"78dcce1b-42e4-464f-be96-8b06099ba402","Type":"ContainerStarted","Data":"4df5ecb6740270543ac9251789551710dd2508aed8f81fb9b83d781f69add03e"} Apr 22 18:52:09.308136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:09.308057 2577 generic.go:358] "Generic (PLEG): container finished" podID="78dcce1b-42e4-464f-be96-8b06099ba402" containerID="5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be" exitCode=0 Apr 22 18:52:09.308136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:09.308112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" event={"ID":"78dcce1b-42e4-464f-be96-8b06099ba402","Type":"ContainerDied","Data":"5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be"} Apr 22 18:52:10.314352 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:10.314314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" event={"ID":"78dcce1b-42e4-464f-be96-8b06099ba402","Type":"ContainerStarted","Data":"66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e"} Apr 22 18:52:10.314352 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:10.314357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" event={"ID":"78dcce1b-42e4-464f-be96-8b06099ba402","Type":"ContainerStarted","Data":"81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f"} Apr 22 18:52:10.314864 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:10.314423 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"
Apr 22 18:52:10.336623 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:10.336578 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" podStartSLOduration=3.336562487 podStartE2EDuration="3.336562487s" podCreationTimestamp="2026-04-22 18:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:52:10.334774703 +0000 UTC m=+865.973299826" watchObservedRunningTime="2026-04-22 18:52:10.336562487 +0000 UTC m=+865.975087608"
Apr 22 18:52:18.028912 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:18.028871 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"
Apr 22 18:52:18.028912 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:18.028920 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"
Apr 22 18:52:18.031590 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:18.031567 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"
Apr 22 18:52:18.348268 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:18.348176 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"
Apr 22 18:52:39.351998 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:39.351913 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"
Apr 22 18:52:40.600294 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:40.600260 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"]
Apr 22 18:52:40.600821 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:40.600631 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="main" containerID="cri-o://81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f" gracePeriod=30
Apr 22 18:52:40.600821 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:40.600689 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="tokenizer" containerID="cri-o://66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e" gracePeriod=30
Apr 22 18:52:41.438768 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.438705 2577 generic.go:358] "Generic (PLEG): container finished" podID="78dcce1b-42e4-464f-be96-8b06099ba402" containerID="81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f" exitCode=0
Apr 22 18:52:41.438955 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.438763 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" event={"ID":"78dcce1b-42e4-464f-be96-8b06099ba402","Type":"ContainerDied","Data":"81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f"}
Apr 22 18:52:41.756543 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.756522 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"
Apr 22 18:52:41.758871 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.758850 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-kserve-provision-location\") pod \"78dcce1b-42e4-464f-be96-8b06099ba402\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") "
Apr 22 18:52:41.758988 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.758898 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/78dcce1b-42e4-464f-be96-8b06099ba402-tls-certs\") pod \"78dcce1b-42e4-464f-be96-8b06099ba402\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") "
Apr 22 18:52:41.758988 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.758926 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-uds\") pod \"78dcce1b-42e4-464f-be96-8b06099ba402\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") "
Apr 22 18:52:41.758988 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.758955 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-tmp\") pod \"78dcce1b-42e4-464f-be96-8b06099ba402\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") "
Apr 22 18:52:41.759165 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759002 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-cache\") pod \"78dcce1b-42e4-464f-be96-8b06099ba402\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") "
Apr 22 18:52:41.759165 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759052 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d85xz\" (UniqueName: \"kubernetes.io/projected/78dcce1b-42e4-464f-be96-8b06099ba402-kube-api-access-d85xz\") pod \"78dcce1b-42e4-464f-be96-8b06099ba402\" (UID: \"78dcce1b-42e4-464f-be96-8b06099ba402\") "
Apr 22 18:52:41.759259 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759238 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "78dcce1b-42e4-464f-be96-8b06099ba402" (UID: "78dcce1b-42e4-464f-be96-8b06099ba402"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:41.759312 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759281 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "78dcce1b-42e4-464f-be96-8b06099ba402" (UID: "78dcce1b-42e4-464f-be96-8b06099ba402"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:41.759362 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759330 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "78dcce1b-42e4-464f-be96-8b06099ba402" (UID: "78dcce1b-42e4-464f-be96-8b06099ba402"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:41.759435 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759418 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-uds\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:52:41.759435 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759434 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-tmp\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:52:41.759609 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759444 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-tokenizer-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:52:41.759757 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.759730 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "78dcce1b-42e4-464f-be96-8b06099ba402" (UID: "78dcce1b-42e4-464f-be96-8b06099ba402"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:41.761004 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.760980 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78dcce1b-42e4-464f-be96-8b06099ba402-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "78dcce1b-42e4-464f-be96-8b06099ba402" (UID: "78dcce1b-42e4-464f-be96-8b06099ba402"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:52:41.761089 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.761041 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78dcce1b-42e4-464f-be96-8b06099ba402-kube-api-access-d85xz" (OuterVolumeSpecName: "kube-api-access-d85xz") pod "78dcce1b-42e4-464f-be96-8b06099ba402" (UID: "78dcce1b-42e4-464f-be96-8b06099ba402"). InnerVolumeSpecName "kube-api-access-d85xz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:52:41.860136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.860109 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d85xz\" (UniqueName: \"kubernetes.io/projected/78dcce1b-42e4-464f-be96-8b06099ba402-kube-api-access-d85xz\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:52:41.860136 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.860132 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78dcce1b-42e4-464f-be96-8b06099ba402-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:52:41.860317 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:41.860142 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/78dcce1b-42e4-464f-be96-8b06099ba402-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 18:52:42.444181 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.444145 2577 generic.go:358] "Generic (PLEG): container finished" podID="78dcce1b-42e4-464f-be96-8b06099ba402" containerID="66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e" exitCode=0
Apr 22 18:52:42.444357 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.444224 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"
Apr 22 18:52:42.444357 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.444219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" event={"ID":"78dcce1b-42e4-464f-be96-8b06099ba402","Type":"ContainerDied","Data":"66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e"}
Apr 22 18:52:42.444357 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.444324 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh" event={"ID":"78dcce1b-42e4-464f-be96-8b06099ba402","Type":"ContainerDied","Data":"4df5ecb6740270543ac9251789551710dd2508aed8f81fb9b83d781f69add03e"}
Apr 22 18:52:42.444357 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.444340 2577 scope.go:117] "RemoveContainer" containerID="66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e"
Apr 22 18:52:42.453731 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.453700 2577 scope.go:117] "RemoveContainer" containerID="81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f"
Apr 22 18:52:42.461493 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.461474 2577 scope.go:117] "RemoveContainer" containerID="5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be"
Apr 22 18:52:42.468198 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.468175 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"]
Apr 22 18:52:42.471533 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.471512 2577 scope.go:117] "RemoveContainer" containerID="66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e"
Apr 22 18:52:42.471803 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:52:42.471782 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e\": container with ID starting with 66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e not found: ID does not exist" containerID="66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e"
Apr 22 18:52:42.471869 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.471810 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e"} err="failed to get container status \"66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e\": rpc error: code = NotFound desc = could not find container \"66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e\": container with ID starting with 66d7a74904c75ba3be2ff2cf305a05f9afb4421509f992fcf5b5a8ac6335b62e not found: ID does not exist"
Apr 22 18:52:42.471869 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.471827 2577 scope.go:117] "RemoveContainer" containerID="81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f"
Apr 22 18:52:42.471990 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.471972 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-b6fcd64xc9mh"]
Apr 22 18:52:42.472065 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:52:42.472049 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f\": container with ID starting with 81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f not found: ID does not exist" containerID="81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f"
Apr 22 18:52:42.472106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.472071 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f"} err="failed to get container status \"81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f\": rpc error: code = NotFound desc = could not find container \"81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f\": container with ID starting with 81cd71fa134b58277f3c3582c3a67fb14c1ff003946326ccef38e38721e2135f not found: ID does not exist"
Apr 22 18:52:42.472106 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.472090 2577 scope.go:117] "RemoveContainer" containerID="5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be"
Apr 22 18:52:42.472317 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:52:42.472300 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be\": container with ID starting with 5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be not found: ID does not exist" containerID="5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be"
Apr 22 18:52:42.472354 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.472322 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be"} err="failed to get container status \"5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be\": rpc error: code = NotFound desc = could not find container \"5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be\": container with ID starting with 5be2648624520390da1eabfbeba0271c7b79425378bc8984b45d3b999a7c02be not found: ID does not exist"
Apr 22 18:52:42.885065 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:42.884985 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" path="/var/lib/kubelet/pods/78dcce1b-42e4-464f-be96-8b06099ba402/volumes"
Apr 22 18:52:50.172242 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.172206 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"]
Apr 22 18:52:50.172821 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.172803 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="main"
Apr 22 18:52:50.172878 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.172824 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="main"
Apr 22 18:52:50.172878 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.172848 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="storage-initializer"
Apr 22 18:52:50.172878 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.172858 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="storage-initializer"
Apr 22 18:52:50.172878 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.172869 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="tokenizer"
Apr 22 18:52:50.173003 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.172879 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="tokenizer"
Apr 22 18:52:50.173003 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.172982 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="main"
Apr 22 18:52:50.173003 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.173001 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="78dcce1b-42e4-464f-be96-8b06099ba402" containerName="tokenizer"
Apr 22 18:52:50.176270 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.176240 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.178616 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.178587 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 22 18:52:50.178767 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.178628 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-flwsw\""
Apr 22 18:52:50.185684 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.185663 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"]
Apr 22 18:52:50.234411 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.234376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.234595 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.234493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.234595 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.234529 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjjm\" (UniqueName: \"kubernetes.io/projected/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kube-api-access-tcjjm\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.234595 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.234582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.234771 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.234606 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.234771 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.234627 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.335843 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.335802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.336016 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.335892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.336016 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.335959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjjm\" (UniqueName: \"kubernetes.io/projected/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kube-api-access-tcjjm\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.336129 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.336026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.336129 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.336071 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.336129 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.336105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.337017 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.336984 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.337149 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.337054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.337149 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.337054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.337329 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.337230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.338818 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.338795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.346664 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.346640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjjm\" (UniqueName: \"kubernetes.io/projected/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kube-api-access-tcjjm\") pod \"precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.488100 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.488030 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:50.622126 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:50.622103 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"]
Apr 22 18:52:50.624710 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:52:50.624682 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b79a61_7758_4bbb_9f9c_7a5eb365a543.slice/crio-dd02abadadc600ee46f4aa5eaae8c48fb93dc218e964fddee42f70612c43f894 WatchSource:0}: Error finding container dd02abadadc600ee46f4aa5eaae8c48fb93dc218e964fddee42f70612c43f894: Status 404 returned error can't find the container with id dd02abadadc600ee46f4aa5eaae8c48fb93dc218e964fddee42f70612c43f894
Apr 22 18:52:51.480219 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:51.480179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerStarted","Data":"1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3"}
Apr 22 18:52:51.480219 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:51.480217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerStarted","Data":"dd02abadadc600ee46f4aa5eaae8c48fb93dc218e964fddee42f70612c43f894"}
Apr 22 18:52:52.485800 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:52.485760 2577 generic.go:358] "Generic (PLEG): container finished" podID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerID="1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3" exitCode=0
Apr 22 18:52:52.486192 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:52.485838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerDied","Data":"1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3"}
Apr 22 18:52:53.491891 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:53.491848 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerStarted","Data":"8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23"}
Apr 22 18:52:53.491891 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:53.491895 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerStarted","Data":"c02e91d39718999dda7092ed287749e77a1922a0702eac4d3a4b8ea495b70b4b"}
Apr 22 18:52:53.492428 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:53.492019 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:52:53.513554 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:52:53.513509 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" podStartSLOduration=3.51349568 podStartE2EDuration="3.51349568s" podCreationTimestamp="2026-04-22 18:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:52:53.511147947 +0000 UTC m=+909.149673085" watchObservedRunningTime="2026-04-22 18:52:53.51349568 +0000 UTC m=+909.152020801"
Apr 22 18:53:00.488781 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:00.488746 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:53:00.489199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:00.488794 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:53:00.490104 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:53:00.490082 2577 logging.go:55] [core] [Channel #175 SubChannel #176]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.50:9003", ServerName: "10.133.0.50:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.50:9003: connect: connection refused"
Apr 22 18:53:00.491337 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:00.491316 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:53:00.527819 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:00.527787 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:53:01.489688 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:01.489642 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.50:9003\" within 1s: context deadline exceeded"
Apr 22 18:53:03.539218 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:03.539185 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4_25b79a61-7758-4bbb-9f9c-7a5eb365a543/main/0.log"
Apr 22 18:53:03.539640 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:03.539519 2577 generic.go:358] "Generic (PLEG): container finished" podID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerID="c02e91d39718999dda7092ed287749e77a1922a0702eac4d3a4b8ea495b70b4b" exitCode=1
Apr 22 18:53:03.539640 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:03.539590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerDied","Data":"c02e91d39718999dda7092ed287749e77a1922a0702eac4d3a4b8ea495b70b4b"}
Apr 22 18:53:03.540056 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:03.540038 2577 scope.go:117] "RemoveContainer" containerID="c02e91d39718999dda7092ed287749e77a1922a0702eac4d3a4b8ea495b70b4b"
Apr 22 18:53:04.545184 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:04.545157 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4_25b79a61-7758-4bbb-9f9c-7a5eb365a543/main/0.log"
Apr 22 18:53:04.545585 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:04.545497 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerStarted","Data":"e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78"}
Apr 22 18:53:04.545883 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:04.545851 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:53:10.488990 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:53:10.488960 2577 logging.go:55] [core] [Channel #183 SubChannel #184]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.50:9003", ServerName: "10.133.0.50:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.50:9003: connect: connection refused"
Apr 22 18:53:11.489777 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:11.489706 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.50:9003\" within 1s: context deadline exceeded"
Apr 22 18:53:11.490202 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:53:11.489819 2577 logging.go:55] [core] [Channel #183 SubChannel #184]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.50:9003", ServerName: "10.133.0.50:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.50:9003: connect: connection refused"
Apr 22 18:53:35.552946 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:35.552915 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"
Apr 22 18:53:36.797932 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:36.797897 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"]
Apr 22 18:53:36.798447 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:36.798210 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="tokenizer" containerID="cri-o://8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23" gracePeriod=30
Apr 22 18:53:36.798447 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:36.798256 2577 kuberuntime_container.go:864] "Killing container with a grace period"
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" containerID="cri-o://e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78" gracePeriod=30 Apr 22 18:53:37.676324 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:37.676290 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4_25b79a61-7758-4bbb-9f9c-7a5eb365a543/main/0.log" Apr 22 18:53:37.676633 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:37.676607 2577 generic.go:358] "Generic (PLEG): container finished" podID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerID="e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78" exitCode=0 Apr 22 18:53:37.676759 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:37.676675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerDied","Data":"e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78"} Apr 22 18:53:37.676759 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:37.676746 2577 scope.go:117] "RemoveContainer" containerID="c02e91d39718999dda7092ed287749e77a1922a0702eac4d3a4b8ea495b70b4b" Apr 22 18:53:38.244631 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.244606 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" Apr 22 18:53:38.285965 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.285885 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tls-certs\") pod \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " Apr 22 18:53:38.285965 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.285932 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-uds\") pod \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " Apr 22 18:53:38.286172 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.285991 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjjm\" (UniqueName: \"kubernetes.io/projected/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kube-api-access-tcjjm\") pod \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " Apr 22 18:53:38.286172 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.286054 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-cache\") pod \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " Apr 22 18:53:38.286494 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.286265 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "25b79a61-7758-4bbb-9f9c-7a5eb365a543" (UID: "25b79a61-7758-4bbb-9f9c-7a5eb365a543"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:38.286494 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.286348 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-tmp\") pod \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " Apr 22 18:53:38.286494 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.286424 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kserve-provision-location\") pod \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\" (UID: \"25b79a61-7758-4bbb-9f9c-7a5eb365a543\") " Apr 22 18:53:38.286494 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.286446 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "25b79a61-7758-4bbb-9f9c-7a5eb365a543" (UID: "25b79a61-7758-4bbb-9f9c-7a5eb365a543"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:38.286877 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.286695 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "25b79a61-7758-4bbb-9f9c-7a5eb365a543" (UID: "25b79a61-7758-4bbb-9f9c-7a5eb365a543"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:38.286877 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.286775 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-uds\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:38.286877 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.286797 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:38.287280 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.287242 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "25b79a61-7758-4bbb-9f9c-7a5eb365a543" (UID: "25b79a61-7758-4bbb-9f9c-7a5eb365a543"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:38.288182 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.288157 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "25b79a61-7758-4bbb-9f9c-7a5eb365a543" (UID: "25b79a61-7758-4bbb-9f9c-7a5eb365a543"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:53:38.288429 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.288409 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kube-api-access-tcjjm" (OuterVolumeSpecName: "kube-api-access-tcjjm") pod "25b79a61-7758-4bbb-9f9c-7a5eb365a543" (UID: "25b79a61-7758-4bbb-9f9c-7a5eb365a543"). 
InnerVolumeSpecName "kube-api-access-tcjjm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:38.387683 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.387654 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tokenizer-tmp\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:38.387683 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.387680 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:38.387866 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.387691 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b79a61-7758-4bbb-9f9c-7a5eb365a543-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:38.387866 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.387702 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tcjjm\" (UniqueName: \"kubernetes.io/projected/25b79a61-7758-4bbb-9f9c-7a5eb365a543-kube-api-access-tcjjm\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:38.682560 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.682522 2577 generic.go:358] "Generic (PLEG): container finished" podID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerID="8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23" exitCode=0 Apr 22 18:53:38.682737 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.682599 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" Apr 22 18:53:38.682737 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.682613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerDied","Data":"8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23"} Apr 22 18:53:38.682737 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.682654 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4" event={"ID":"25b79a61-7758-4bbb-9f9c-7a5eb365a543","Type":"ContainerDied","Data":"dd02abadadc600ee46f4aa5eaae8c48fb93dc218e964fddee42f70612c43f894"} Apr 22 18:53:38.682737 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.682673 2577 scope.go:117] "RemoveContainer" containerID="e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78" Apr 22 18:53:38.694609 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.694589 2577 scope.go:117] "RemoveContainer" containerID="8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23" Apr 22 18:53:38.702813 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.702760 2577 scope.go:117] "RemoveContainer" containerID="1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3" Apr 22 18:53:38.708127 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.708104 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"] Apr 22 18:53:38.711443 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.711368 2577 scope.go:117] "RemoveContainer" containerID="e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78" Apr 22 18:53:38.711834 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:53:38.711802 2577 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78\": container with ID starting with e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78 not found: ID does not exist" containerID="e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78" Apr 22 18:53:38.711924 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.711844 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78"} err="failed to get container status \"e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78\": rpc error: code = NotFound desc = could not find container \"e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78\": container with ID starting with e345975f311897881b36e3998c5421374845d3cf53b8d25de6983a48a4c5fb78 not found: ID does not exist" Apr 22 18:53:38.711924 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.711870 2577 scope.go:117] "RemoveContainer" containerID="8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23" Apr 22 18:53:38.712179 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:53:38.712160 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23\": container with ID starting with 8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23 not found: ID does not exist" containerID="8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23" Apr 22 18:53:38.712242 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.712186 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23"} err="failed to get container status 
\"8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23\": rpc error: code = NotFound desc = could not find container \"8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23\": container with ID starting with 8bac9d7c1ba3d35b5ddbd544b1ec882f68813f36b62ea92f97a24336e1183b23 not found: ID does not exist" Apr 22 18:53:38.712242 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.712209 2577 scope.go:117] "RemoveContainer" containerID="1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3" Apr 22 18:53:38.712465 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:53:38.712447 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3\": container with ID starting with 1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3 not found: ID does not exist" containerID="1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3" Apr 22 18:53:38.712514 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.712468 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3"} err="failed to get container status \"1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3\": rpc error: code = NotFound desc = could not find container \"1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3\": container with ID starting with 1144c2c5dec6735386d15c5307ea0c9bb1980d6b8ceba9f7a76ea46d6eb327c3 not found: ID does not exist" Apr 22 18:53:38.713834 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.713814 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-9ffb96cdw9gl4"] Apr 22 18:53:38.885297 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:38.885268 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" path="/var/lib/kubelet/pods/25b79a61-7758-4bbb-9f9c-7a5eb365a543/volumes" Apr 22 18:53:47.714362 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:47.714327 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g"] Apr 22 18:53:47.714888 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:47.714767 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="main" containerID="cri-o://0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42" gracePeriod=30 Apr 22 18:53:47.714888 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:47.714813 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="tokenizer" containerID="cri-o://fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348" gracePeriod=30 Apr 22 18:53:48.103357 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.103261 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.48:8082/healthz\": dial tcp 10.133.0.48:8082: connect: connection refused" Apr 22 18:53:48.727469 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.727434 2577 generic.go:358] "Generic (PLEG): container finished" podID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerID="0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42" exitCode=0 Apr 22 18:53:48.727803 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.727514 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" event={"ID":"cab36a5b-029d-4278-b31d-b399d0b3a2f9","Type":"ContainerDied","Data":"0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42"} Apr 22 18:53:48.880037 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.880013 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:53:48.981462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981389 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tls-certs\") pod \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " Apr 22 18:53:48.981462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981422 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kserve-provision-location\") pod \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " Apr 22 18:53:48.981647 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981479 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-uds\") pod \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " Apr 22 18:53:48.981647 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981507 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-tmp\") pod \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " 
Apr 22 18:53:48.981647 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981526 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfc7h\" (UniqueName: \"kubernetes.io/projected/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kube-api-access-xfc7h\") pod \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " Apr 22 18:53:48.981647 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981571 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-cache\") pod \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\" (UID: \"cab36a5b-029d-4278-b31d-b399d0b3a2f9\") " Apr 22 18:53:48.981891 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981790 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "cab36a5b-029d-4278-b31d-b399d0b3a2f9" (UID: "cab36a5b-029d-4278-b31d-b399d0b3a2f9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:48.981954 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981898 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "cab36a5b-029d-4278-b31d-b399d0b3a2f9" (UID: "cab36a5b-029d-4278-b31d-b399d0b3a2f9"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:48.981954 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981931 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "cab36a5b-029d-4278-b31d-b399d0b3a2f9" (UID: "cab36a5b-029d-4278-b31d-b399d0b3a2f9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:48.982027 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981978 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-uds\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:48.982027 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.981991 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-tmp\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:48.982332 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.982308 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cab36a5b-029d-4278-b31d-b399d0b3a2f9" (UID: "cab36a5b-029d-4278-b31d-b399d0b3a2f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:48.983440 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.983422 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cab36a5b-029d-4278-b31d-b399d0b3a2f9" (UID: "cab36a5b-029d-4278-b31d-b399d0b3a2f9"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:53:48.983561 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:48.983544 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kube-api-access-xfc7h" (OuterVolumeSpecName: "kube-api-access-xfc7h") pod "cab36a5b-029d-4278-b31d-b399d0b3a2f9" (UID: "cab36a5b-029d-4278-b31d-b399d0b3a2f9"). InnerVolumeSpecName "kube-api-access-xfc7h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:49.082878 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.082845 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tokenizer-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:49.082878 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.082875 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cab36a5b-029d-4278-b31d-b399d0b3a2f9-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:49.083062 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.082887 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:49.083062 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.082896 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfc7h\" (UniqueName: \"kubernetes.io/projected/cab36a5b-029d-4278-b31d-b399d0b3a2f9-kube-api-access-xfc7h\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:53:49.734066 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.734030 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerID="fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348" exitCode=0 Apr 22 18:53:49.734462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.734111 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" Apr 22 18:53:49.734462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.734121 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" event={"ID":"cab36a5b-029d-4278-b31d-b399d0b3a2f9","Type":"ContainerDied","Data":"fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348"} Apr 22 18:53:49.734462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.734169 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g" event={"ID":"cab36a5b-029d-4278-b31d-b399d0b3a2f9","Type":"ContainerDied","Data":"d885625784f0f3ed3a96ac3988b809b3dfc9af0db70d3ece5f9d37c66c6e4084"} Apr 22 18:53:49.734462 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.734187 2577 scope.go:117] "RemoveContainer" containerID="fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348" Apr 22 18:53:49.743941 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.743923 2577 scope.go:117] "RemoveContainer" containerID="0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42" Apr 22 18:53:49.753658 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.753641 2577 scope.go:117] "RemoveContainer" containerID="7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494" Apr 22 18:53:49.757259 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.757239 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g"] Apr 22 18:53:49.763178 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:53:49.763156 2577 scope.go:117] "RemoveContainer" containerID="fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348" Apr 22 18:53:49.763413 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:53:49.763396 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348\": container with ID starting with fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348 not found: ID does not exist" containerID="fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348" Apr 22 18:53:49.763481 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.763421 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348"} err="failed to get container status \"fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348\": rpc error: code = NotFound desc = could not find container \"fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348\": container with ID starting with fce2550cce563347927eb3fb1ed6b6fe92fab18ee13bd64f0b5b11e4aef70348 not found: ID does not exist" Apr 22 18:53:49.763481 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.763438 2577 scope.go:117] "RemoveContainer" containerID="0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42" Apr 22 18:53:49.763679 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:53:49.763664 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42\": container with ID starting with 0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42 not found: ID does not exist" containerID="0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42" Apr 22 18:53:49.763748 ip-10-0-139-10 kubenswrapper[2577]: I0422 
18:53:49.763681 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42"} err="failed to get container status \"0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42\": rpc error: code = NotFound desc = could not find container \"0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42\": container with ID starting with 0c358aa8a92bf6b5e7cf11725f95744144dc0cc43fb79357b0b89990463a3e42 not found: ID does not exist" Apr 22 18:53:49.763748 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.763693 2577 scope.go:117] "RemoveContainer" containerID="7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494" Apr 22 18:53:49.763939 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:53:49.763915 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494\": container with ID starting with 7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494 not found: ID does not exist" containerID="7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494" Apr 22 18:53:49.764045 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.763943 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494"} err="failed to get container status \"7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494\": rpc error: code = NotFound desc = could not find container \"7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494\": container with ID starting with 7fb24e62620519110314905af6e355bb5bb9a0a803bbb563f6b72d850b562494 not found: ID does not exist" Apr 22 18:53:49.764313 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:49.764296 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheq5p4g"] Apr 22 18:53:50.885179 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:50.885150 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" path="/var/lib/kubelet/pods/cab36a5b-029d-4278-b31d-b399d0b3a2f9/volumes" Apr 22 18:53:58.600624 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.600551 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx"] Apr 22 18:53:58.601001 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.600963 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" Apr 22 18:53:58.601001 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.600979 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" Apr 22 18:53:58.601001 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.600993 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="main" Apr 22 18:53:58.601001 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.600998 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="main" Apr 22 18:53:58.601001 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601005 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="tokenizer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601010 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="tokenizer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601020 2577 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601028 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601059 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="tokenizer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601068 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="tokenizer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601088 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="storage-initializer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601097 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="storage-initializer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601108 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="storage-initializer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601117 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="storage-initializer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601220 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="main" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601233 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="tokenizer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601249 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601262 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cab36a5b-029d-4278-b31d-b399d0b3a2f9" containerName="tokenizer" Apr 22 18:53:58.601344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.601272 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="25b79a61-7758-4bbb-9f9c-7a5eb365a543" containerName="main" Apr 22 18:53:58.606358 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.606335 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.608701 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.608682 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:53:58.608807 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.608682 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 18:53:58.609534 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.609517 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-c9zk6\"" Apr 22 18:53:58.609812 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.609532 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 18:53:58.610070 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.609552 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:53:58.613965 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.613650 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx"] Apr 22 18:53:58.764564 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.764531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2kcz\" (UniqueName: \"kubernetes.io/projected/c68c85d1-f743-4244-b216-693c4d762202-kube-api-access-c2kcz\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.764771 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.764612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.764771 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.764657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.764771 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.764754 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.764973 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.764777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c68c85d1-f743-4244-b216-693c4d762202-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.764973 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.764815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866157 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2kcz\" (UniqueName: \"kubernetes.io/projected/c68c85d1-f743-4244-b216-693c4d762202-kube-api-access-c2kcz\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866157 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866128 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866157 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866441 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866441 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c68c85d1-f743-4244-b216-693c4d762202-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866441 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866262 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866624 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866685 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866685 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866648 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.866685 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.866675 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.868742 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.868705 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c68c85d1-f743-4244-b216-693c4d762202-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.874318 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.874297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2kcz\" (UniqueName: \"kubernetes.io/projected/c68c85d1-f743-4244-b216-693c4d762202-kube-api-access-c2kcz\") pod \"custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:58.916173 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:58.916137 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:53:59.043960 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:59.043934 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx"] Apr 22 18:53:59.045324 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:53:59.045298 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68c85d1_f743_4244_b216_693c4d762202.slice/crio-462ac0979575113ea3f31b5d78843bc760c945940cc93b51d1cb350f1f8a9028 WatchSource:0}: Error finding container 462ac0979575113ea3f31b5d78843bc760c945940cc93b51d1cb350f1f8a9028: Status 404 returned error can't find the container with id 462ac0979575113ea3f31b5d78843bc760c945940cc93b51d1cb350f1f8a9028 Apr 22 18:53:59.773217 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:59.773179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" event={"ID":"c68c85d1-f743-4244-b216-693c4d762202","Type":"ContainerStarted","Data":"be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8"} Apr 22 18:53:59.773217 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:53:59.773221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" event={"ID":"c68c85d1-f743-4244-b216-693c4d762202","Type":"ContainerStarted","Data":"462ac0979575113ea3f31b5d78843bc760c945940cc93b51d1cb350f1f8a9028"} Apr 22 18:54:00.778501 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:00.778464 2577 generic.go:358] "Generic (PLEG): container finished" podID="c68c85d1-f743-4244-b216-693c4d762202" containerID="be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8" exitCode=0 Apr 22 18:54:00.778902 ip-10-0-139-10 kubenswrapper[2577]: I0422 
18:54:00.778552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" event={"ID":"c68c85d1-f743-4244-b216-693c4d762202","Type":"ContainerDied","Data":"be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8"} Apr 22 18:54:01.785669 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:01.785631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" event={"ID":"c68c85d1-f743-4244-b216-693c4d762202","Type":"ContainerStarted","Data":"c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541"} Apr 22 18:54:01.785669 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:01.785672 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" event={"ID":"c68c85d1-f743-4244-b216-693c4d762202","Type":"ContainerStarted","Data":"9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b"} Apr 22 18:54:01.786114 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:01.785766 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:54:01.808756 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:01.808697 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" podStartSLOduration=3.808684959 podStartE2EDuration="3.808684959s" podCreationTimestamp="2026-04-22 18:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:54:01.804910705 +0000 UTC m=+977.443435828" watchObservedRunningTime="2026-04-22 18:54:01.808684959 +0000 UTC m=+977.447210131" Apr 22 18:54:08.916963 ip-10-0-139-10 kubenswrapper[2577]: 
I0422 18:54:08.916932 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:54:08.916963 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:08.916960 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:54:08.919812 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:08.919787 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:54:09.816508 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:09.816479 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:54:30.820211 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:54:30.820181 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:57:07.328518 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:07.328420 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx"] Apr 22 18:57:07.329107 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:07.328863 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="main" containerID="cri-o://9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b" gracePeriod=30 Apr 22 18:57:07.329107 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:07.328927 2577 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="tokenizer" containerID="cri-o://c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541" gracePeriod=30 Apr 22 18:57:07.526015 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:07.525982 2577 generic.go:358] "Generic (PLEG): container finished" podID="c68c85d1-f743-4244-b216-693c4d762202" containerID="9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b" exitCode=0 Apr 22 18:57:07.526195 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:07.526034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" event={"ID":"c68c85d1-f743-4244-b216-693c4d762202","Type":"ContainerDied","Data":"9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b"} Apr 22 18:57:08.478591 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.478570 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:57:08.531152 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.531066 2577 generic.go:358] "Generic (PLEG): container finished" podID="c68c85d1-f743-4244-b216-693c4d762202" containerID="c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541" exitCode=0 Apr 22 18:57:08.531309 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.531157 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" Apr 22 18:57:08.531309 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.531157 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" event={"ID":"c68c85d1-f743-4244-b216-693c4d762202","Type":"ContainerDied","Data":"c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541"} Apr 22 18:57:08.531309 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.531202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx" event={"ID":"c68c85d1-f743-4244-b216-693c4d762202","Type":"ContainerDied","Data":"462ac0979575113ea3f31b5d78843bc760c945940cc93b51d1cb350f1f8a9028"} Apr 22 18:57:08.531309 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.531220 2577 scope.go:117] "RemoveContainer" containerID="c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541" Apr 22 18:57:08.539611 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.539601 2577 scope.go:117] "RemoveContainer" containerID="9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b" Apr 22 18:57:08.539817 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.539801 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c68c85d1-f743-4244-b216-693c4d762202-tls-certs\") pod \"c68c85d1-f743-4244-b216-693c4d762202\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " Apr 22 18:57:08.539936 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.539921 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2kcz\" (UniqueName: \"kubernetes.io/projected/c68c85d1-f743-4244-b216-693c4d762202-kube-api-access-c2kcz\") pod \"c68c85d1-f743-4244-b216-693c4d762202\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") 
" Apr 22 18:57:08.539993 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.539966 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-uds\") pod \"c68c85d1-f743-4244-b216-693c4d762202\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " Apr 22 18:57:08.540050 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.539997 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-tmp\") pod \"c68c85d1-f743-4244-b216-693c4d762202\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " Apr 22 18:57:08.540050 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.540033 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-kserve-provision-location\") pod \"c68c85d1-f743-4244-b216-693c4d762202\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " Apr 22 18:57:08.540159 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.540084 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-cache\") pod \"c68c85d1-f743-4244-b216-693c4d762202\" (UID: \"c68c85d1-f743-4244-b216-693c4d762202\") " Apr 22 18:57:08.540285 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.540260 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c68c85d1-f743-4244-b216-693c4d762202" (UID: "c68c85d1-f743-4244-b216-693c4d762202"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:57:08.540371 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.540342 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c68c85d1-f743-4244-b216-693c4d762202" (UID: "c68c85d1-f743-4244-b216-693c4d762202"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:57:08.540564 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.540542 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c68c85d1-f743-4244-b216-693c4d762202" (UID: "c68c85d1-f743-4244-b216-693c4d762202"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:57:08.540646 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.540585 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-uds\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:57:08.540646 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.540599 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-tmp\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:57:08.541295 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.541271 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c68c85d1-f743-4244-b216-693c4d762202" (UID: "c68c85d1-f743-4244-b216-693c4d762202"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:57:08.542095 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.542076 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68c85d1-f743-4244-b216-693c4d762202-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c68c85d1-f743-4244-b216-693c4d762202" (UID: "c68c85d1-f743-4244-b216-693c4d762202"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:57:08.542255 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.542229 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68c85d1-f743-4244-b216-693c4d762202-kube-api-access-c2kcz" (OuterVolumeSpecName: "kube-api-access-c2kcz") pod "c68c85d1-f743-4244-b216-693c4d762202" (UID: "c68c85d1-f743-4244-b216-693c4d762202"). InnerVolumeSpecName "kube-api-access-c2kcz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:57:08.559396 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.559375 2577 scope.go:117] "RemoveContainer" containerID="be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8" Apr 22 18:57:08.566679 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.566662 2577 scope.go:117] "RemoveContainer" containerID="c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541" Apr 22 18:57:08.566976 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:57:08.566954 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541\": container with ID starting with c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541 not found: ID does not exist" containerID="c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541" Apr 22 18:57:08.567079 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.566982 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541"} err="failed to get container status \"c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541\": rpc error: code = NotFound desc = could not find container \"c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541\": container with ID starting with c4d54330080d693632f8722195d9b247bd35836940c7a4dacf07885670041541 not found: ID does not exist" Apr 22 18:57:08.567079 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.567000 2577 scope.go:117] "RemoveContainer" containerID="9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b" Apr 22 18:57:08.567264 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:57:08.567244 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b\": container with ID starting with 9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b not found: ID does not exist" containerID="9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b" Apr 22 18:57:08.567327 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.567269 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b"} err="failed to get container status \"9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b\": rpc error: code = NotFound desc = could not find container \"9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b\": container with ID starting with 9121b82765d49d11c131724e1e09897e8474fda04a1a853e994bb8ac221d378b not found: ID does not exist" Apr 22 18:57:08.567327 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.567287 2577 scope.go:117] "RemoveContainer" 
containerID="be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8" Apr 22 18:57:08.567498 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:57:08.567477 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8\": container with ID starting with be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8 not found: ID does not exist" containerID="be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8" Apr 22 18:57:08.567541 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.567508 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8"} err="failed to get container status \"be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8\": rpc error: code = NotFound desc = could not find container \"be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8\": container with ID starting with be62f836f86f4c44687da7601c92a10153c6f4e0e0616f85f8b13207484b77f8 not found: ID does not exist" Apr 22 18:57:08.642034 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.642012 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c2kcz\" (UniqueName: \"kubernetes.io/projected/c68c85d1-f743-4244-b216-693c4d762202-kube-api-access-c2kcz\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:57:08.642034 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.642032 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:57:08.642161 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.642042 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/c68c85d1-f743-4244-b216-693c4d762202-tokenizer-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:57:08.642161 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.642053 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c68c85d1-f743-4244-b216-693c4d762202-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:57:08.854945 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.854915 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx"] Apr 22 18:57:08.859083 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.859059 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7746f9cc5tktx"] Apr 22 18:57:08.885155 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:08.885118 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68c85d1-f743-4244-b216-693c4d762202" path="/var/lib/kubelet/pods/c68c85d1-f743-4244-b216-693c4d762202/volumes" Apr 22 18:57:23.374236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374200 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb"] Apr 22 18:57:23.374739 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374700 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="main" Apr 22 18:57:23.374739 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374728 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="main" Apr 22 18:57:23.374836 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374760 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="tokenizer" Apr 22 18:57:23.374836 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374775 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="tokenizer" Apr 22 18:57:23.374836 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374784 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="storage-initializer" Apr 22 18:57:23.374836 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374790 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="storage-initializer" Apr 22 18:57:23.374968 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374857 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="tokenizer" Apr 22 18:57:23.374968 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.374864 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c68c85d1-f743-4244-b216-693c4d762202" containerName="main" Apr 22 18:57:23.380374 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.380353 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.382611 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.382589 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:57:23.383445 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.383421 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 18:57:23.383445 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.383438 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:57:23.383586 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.383433 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-ckmdn\"" Apr 22 18:57:23.383586 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.383510 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 18:57:23.391567 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.391546 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb"] Apr 22 18:57:23.470277 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.470250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.470442 ip-10-0-139-10 kubenswrapper[2577]: 
I0422 18:57:23.470288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.470442 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.470356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dec66b1-20c2-44b9-b4b4-195bf240809a-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.470442 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.470414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.470442 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.470440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcjlc\" (UniqueName: \"kubernetes.io/projected/6dec66b1-20c2-44b9-b4b4-195bf240809a-kube-api-access-wcjlc\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 
18:57:23.470576 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.470538 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.571812 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.571774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.571970 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.571832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dec66b1-20c2-44b9-b4b4-195bf240809a-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.571970 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.571885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.571970 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:57:23.571920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcjlc\" (UniqueName: \"kubernetes.io/projected/6dec66b1-20c2-44b9-b4b4-195bf240809a-kube-api-access-wcjlc\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.572110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.571981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.572110 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.572047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.572242 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.572218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.572242 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.572231 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.572360 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.572350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.572420 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.572397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.574392 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.574372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dec66b1-20c2-44b9-b4b4-195bf240809a-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.579810 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.579787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wcjlc\" (UniqueName: \"kubernetes.io/projected/6dec66b1-20c2-44b9-b4b4-195bf240809a-kube-api-access-wcjlc\") pod \"router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.690139 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.690113 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:23.828868 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.828844 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb"] Apr 22 18:57:23.830348 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:57:23.830315 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dec66b1_20c2_44b9_b4b4_195bf240809a.slice/crio-74ed0a45c1859a5264ab373aa61208e60f8e40b48f36b8fb82406ba62fa95914 WatchSource:0}: Error finding container 74ed0a45c1859a5264ab373aa61208e60f8e40b48f36b8fb82406ba62fa95914: Status 404 returned error can't find the container with id 74ed0a45c1859a5264ab373aa61208e60f8e40b48f36b8fb82406ba62fa95914 Apr 22 18:57:23.832369 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:23.832348 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:57:24.595988 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:24.595949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" event={"ID":"6dec66b1-20c2-44b9-b4b4-195bf240809a","Type":"ContainerStarted","Data":"22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe"} Apr 22 18:57:24.595988 ip-10-0-139-10 kubenswrapper[2577]: I0422 
18:57:24.595989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" event={"ID":"6dec66b1-20c2-44b9-b4b4-195bf240809a","Type":"ContainerStarted","Data":"74ed0a45c1859a5264ab373aa61208e60f8e40b48f36b8fb82406ba62fa95914"} Apr 22 18:57:25.600945 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:25.600910 2577 generic.go:358] "Generic (PLEG): container finished" podID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerID="22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe" exitCode=0 Apr 22 18:57:25.601345 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:25.600990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" event={"ID":"6dec66b1-20c2-44b9-b4b4-195bf240809a","Type":"ContainerDied","Data":"22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe"} Apr 22 18:57:26.606276 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:26.606241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" event={"ID":"6dec66b1-20c2-44b9-b4b4-195bf240809a","Type":"ContainerStarted","Data":"0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc"} Apr 22 18:57:26.606276 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:26.606278 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" event={"ID":"6dec66b1-20c2-44b9-b4b4-195bf240809a","Type":"ContainerStarted","Data":"5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f"} Apr 22 18:57:26.606757 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:26.606358 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:26.629913 ip-10-0-139-10 
kubenswrapper[2577]: I0422 18:57:26.629859 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" podStartSLOduration=3.629840313 podStartE2EDuration="3.629840313s" podCreationTimestamp="2026-04-22 18:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:26.626132817 +0000 UTC m=+1182.264657939" watchObservedRunningTime="2026-04-22 18:57:26.629840313 +0000 UTC m=+1182.268365436" Apr 22 18:57:33.690844 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:33.690807 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:33.690844 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:33.690848 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:33.693507 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:33.693482 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:34.642779 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:34.642742 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:57:55.646793 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:57:55.646762 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:59:39.803478 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:39.803438 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb"] Apr 22 18:59:39.803955 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:39.803781 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="main" containerID="cri-o://5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f" gracePeriod=30 Apr 22 18:59:39.803955 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:39.803846 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="tokenizer" containerID="cri-o://0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc" gracePeriod=30 Apr 22 18:59:40.120125 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:40.120043 2577 generic.go:358] "Generic (PLEG): container finished" podID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerID="5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f" exitCode=0 Apr 22 18:59:40.120125 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:40.120090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" event={"ID":"6dec66b1-20c2-44b9-b4b4-195bf240809a","Type":"ContainerDied","Data":"5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f"} Apr 22 18:59:40.946084 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:40.946063 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:59:41.072659 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.072578 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-uds\") pod \"6dec66b1-20c2-44b9-b4b4-195bf240809a\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " Apr 22 18:59:41.072659 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.072650 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcjlc\" (UniqueName: \"kubernetes.io/projected/6dec66b1-20c2-44b9-b4b4-195bf240809a-kube-api-access-wcjlc\") pod \"6dec66b1-20c2-44b9-b4b4-195bf240809a\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " Apr 22 18:59:41.072911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.072753 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-kserve-provision-location\") pod \"6dec66b1-20c2-44b9-b4b4-195bf240809a\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " Apr 22 18:59:41.072911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.072785 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dec66b1-20c2-44b9-b4b4-195bf240809a-tls-certs\") pod \"6dec66b1-20c2-44b9-b4b4-195bf240809a\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " Apr 22 18:59:41.072911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.072823 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-cache\") pod \"6dec66b1-20c2-44b9-b4b4-195bf240809a\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " Apr 
22 18:59:41.072911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.072867 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6dec66b1-20c2-44b9-b4b4-195bf240809a" (UID: "6dec66b1-20c2-44b9-b4b4-195bf240809a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:41.072911 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.072885 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-tmp\") pod \"6dec66b1-20c2-44b9-b4b4-195bf240809a\" (UID: \"6dec66b1-20c2-44b9-b4b4-195bf240809a\") " Apr 22 18:59:41.073180 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.073117 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6dec66b1-20c2-44b9-b4b4-195bf240809a" (UID: "6dec66b1-20c2-44b9-b4b4-195bf240809a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:41.073236 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.073206 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6dec66b1-20c2-44b9-b4b4-195bf240809a" (UID: "6dec66b1-20c2-44b9-b4b4-195bf240809a"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:41.073278 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.073259 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-uds\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:59:41.073316 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.073282 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:59:41.073484 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.073465 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6dec66b1-20c2-44b9-b4b4-195bf240809a" (UID: "6dec66b1-20c2-44b9-b4b4-195bf240809a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:41.074783 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.074761 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dec66b1-20c2-44b9-b4b4-195bf240809a-kube-api-access-wcjlc" (OuterVolumeSpecName: "kube-api-access-wcjlc") pod "6dec66b1-20c2-44b9-b4b4-195bf240809a" (UID: "6dec66b1-20c2-44b9-b4b4-195bf240809a"). InnerVolumeSpecName "kube-api-access-wcjlc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:59:41.074878 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.074777 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dec66b1-20c2-44b9-b4b4-195bf240809a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6dec66b1-20c2-44b9-b4b4-195bf240809a" (UID: "6dec66b1-20c2-44b9-b4b4-195bf240809a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:59:41.126292 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.126261 2577 generic.go:358] "Generic (PLEG): container finished" podID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerID="0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc" exitCode=0 Apr 22 18:59:41.126456 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.126331 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" event={"ID":"6dec66b1-20c2-44b9-b4b4-195bf240809a","Type":"ContainerDied","Data":"0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc"} Apr 22 18:59:41.126456 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.126342 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" Apr 22 18:59:41.126456 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.126367 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb" event={"ID":"6dec66b1-20c2-44b9-b4b4-195bf240809a","Type":"ContainerDied","Data":"74ed0a45c1859a5264ab373aa61208e60f8e40b48f36b8fb82406ba62fa95914"} Apr 22 18:59:41.126456 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.126389 2577 scope.go:117] "RemoveContainer" containerID="0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc" Apr 22 18:59:41.136765 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.136747 2577 scope.go:117] "RemoveContainer" containerID="5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f" Apr 22 18:59:41.147762 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.147571 2577 scope.go:117] "RemoveContainer" containerID="22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe" Apr 22 18:59:41.150489 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.150467 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb"] Apr 22 18:59:41.155286 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.155263 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-548f476f84-x9ngb"] Apr 22 18:59:41.156262 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.156244 2577 scope.go:117] "RemoveContainer" containerID="0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc" Apr 22 18:59:41.156483 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:59:41.156466 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc\": container with ID starting with 0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc not found: ID does not exist" containerID="0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc" Apr 22 18:59:41.156529 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.156495 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc"} err="failed to get container status \"0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc\": rpc error: code = NotFound desc = could not find container \"0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc\": container with ID starting with 0d054a5f474bbbb7a4e7f8bd6b9e47b185d33a453cbbcff44bdb6a6acc0a57cc not found: ID does not exist" Apr 22 18:59:41.156529 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.156512 2577 scope.go:117] "RemoveContainer" containerID="5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f" Apr 22 18:59:41.156749 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:59:41.156709 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f\": container with ID starting with 5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f not found: ID does not exist" containerID="5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f" Apr 22 18:59:41.156794 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.156755 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f"} err="failed to get container status \"5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f\": rpc error: code = NotFound desc = could not find container 
\"5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f\": container with ID starting with 5f36537c6f0d974238b96b80d7c31497fdb4d5203b78cb74c2ec477c8c36fc6f not found: ID does not exist" Apr 22 18:59:41.156794 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.156770 2577 scope.go:117] "RemoveContainer" containerID="22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe" Apr 22 18:59:41.157000 ip-10-0-139-10 kubenswrapper[2577]: E0422 18:59:41.156981 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe\": container with ID starting with 22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe not found: ID does not exist" containerID="22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe" Apr 22 18:59:41.157056 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.157009 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe"} err="failed to get container status \"22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe\": rpc error: code = NotFound desc = could not find container \"22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe\": container with ID starting with 22e8b03bb989e05bcb432eca31af26d907eb322ec3482cb27182e5c758e626fe not found: ID does not exist" Apr 22 18:59:41.174123 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.174103 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:59:41.174192 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.174126 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6dec66b1-20c2-44b9-b4b4-195bf240809a-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:59:41.174192 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.174135 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dec66b1-20c2-44b9-b4b4-195bf240809a-tokenizer-tmp\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:59:41.174192 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:41.174144 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wcjlc\" (UniqueName: \"kubernetes.io/projected/6dec66b1-20c2-44b9-b4b4-195bf240809a-kube-api-access-wcjlc\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 18:59:42.885597 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:42.885565 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" path="/var/lib/kubelet/pods/6dec66b1-20c2-44b9-b4b4-195bf240809a/volumes" Apr 22 18:59:50.385135 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385099 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc"] Apr 22 18:59:50.385531 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385509 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="main" Apr 22 18:59:50.385531 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385520 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="main" Apr 22 18:59:50.385627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385537 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="storage-initializer" Apr 22 18:59:50.385627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385543 
2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="storage-initializer" Apr 22 18:59:50.385627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385550 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="tokenizer" Apr 22 18:59:50.385627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385555 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="tokenizer" Apr 22 18:59:50.385627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385614 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="tokenizer" Apr 22 18:59:50.385627 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.385620 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dec66b1-20c2-44b9-b4b4-195bf240809a" containerName="main" Apr 22 18:59:50.390531 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.390495 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.393780 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.393754 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-ck6zx\"" Apr 22 18:59:50.393996 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.393969 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 18:59:50.394199 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.394004 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:59:50.394299 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.394030 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 18:59:50.394299 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.394066 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:59:50.399473 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.399451 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc"] Apr 22 18:59:50.549866 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.549832 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.550053 
ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.549891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.550053 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.549928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb76m\" (UniqueName: \"kubernetes.io/projected/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kube-api-access-cb76m\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.550053 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.550007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.550203 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.550068 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.550203 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.550091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651500 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651500 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651757 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651757 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651757 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651757 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651632 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb76m\" (UniqueName: \"kubernetes.io/projected/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kube-api-access-cb76m\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651972 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651972 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.651972 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.652080 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.651972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.654117 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.654100 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.659968 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.659942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb76m\" (UniqueName: \"kubernetes.io/projected/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kube-api-access-cb76m\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.702414 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.702383 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:50.829302 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:50.829279 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc"] Apr 22 18:59:50.831108 ip-10-0-139-10 kubenswrapper[2577]: W0422 18:59:50.831082 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58ddb8db_6c88_4367_84f0_63d0ffd09ee4.slice/crio-82bf9ac3e629af7b5d1c56b86f41a7b197fbb1cd954a22b17fd8852cb80f084f WatchSource:0}: Error finding container 82bf9ac3e629af7b5d1c56b86f41a7b197fbb1cd954a22b17fd8852cb80f084f: Status 404 returned error can't find the container with id 82bf9ac3e629af7b5d1c56b86f41a7b197fbb1cd954a22b17fd8852cb80f084f Apr 22 18:59:51.176060 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:51.176022 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" event={"ID":"58ddb8db-6c88-4367-84f0-63d0ffd09ee4","Type":"ContainerStarted","Data":"727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b"} Apr 22 
18:59:51.176060 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:51.176064 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" event={"ID":"58ddb8db-6c88-4367-84f0-63d0ffd09ee4","Type":"ContainerStarted","Data":"82bf9ac3e629af7b5d1c56b86f41a7b197fbb1cd954a22b17fd8852cb80f084f"} Apr 22 18:59:52.181892 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:52.181858 2577 generic.go:358] "Generic (PLEG): container finished" podID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerID="727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b" exitCode=0 Apr 22 18:59:52.182344 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:52.181948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" event={"ID":"58ddb8db-6c88-4367-84f0-63d0ffd09ee4","Type":"ContainerDied","Data":"727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b"} Apr 22 18:59:53.186962 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:53.186931 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" event={"ID":"58ddb8db-6c88-4367-84f0-63d0ffd09ee4","Type":"ContainerStarted","Data":"cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89"} Apr 22 18:59:53.186962 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:53.186966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" event={"ID":"58ddb8db-6c88-4367-84f0-63d0ffd09ee4","Type":"ContainerStarted","Data":"545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2"} Apr 22 18:59:53.187458 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:53.187186 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 18:59:53.208640 ip-10-0-139-10 kubenswrapper[2577]: I0422 18:59:53.208597 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" podStartSLOduration=3.208583857 podStartE2EDuration="3.208583857s" podCreationTimestamp="2026-04-22 18:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:59:53.205824502 +0000 UTC m=+1328.844349625" watchObservedRunningTime="2026-04-22 18:59:53.208583857 +0000 UTC m=+1328.847109019" Apr 22 19:00:00.703366 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:00:00.703279 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 19:00:00.703366 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:00:00.703334 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 19:00:00.706088 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:00:00.706062 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 19:00:01.221264 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:00:01.221238 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 19:00:22.225752 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:00:22.225710 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 19:02:49.560972 ip-10-0-139-10 
kubenswrapper[2577]: I0422 19:02:49.560929 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc"] Apr 22 19:02:49.561431 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:49.561228 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="main" containerID="cri-o://545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2" gracePeriod=30 Apr 22 19:02:49.561431 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:49.561284 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="tokenizer" containerID="cri-o://cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89" gracePeriod=30 Apr 22 19:02:49.869114 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:49.869039 2577 generic.go:358] "Generic (PLEG): container finished" podID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerID="545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2" exitCode=0 Apr 22 19:02:49.869114 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:49.869078 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" event={"ID":"58ddb8db-6c88-4367-84f0-63d0ffd09ee4","Type":"ContainerDied","Data":"545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2"} Apr 22 19:02:50.720974 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.720952 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 19:02:50.838524 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838450 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kserve-provision-location\") pod \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " Apr 22 19:02:50.838524 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838504 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-tmp\") pod \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " Apr 22 19:02:50.838768 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838550 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tls-certs\") pod \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " Apr 22 19:02:50.838768 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838580 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb76m\" (UniqueName: \"kubernetes.io/projected/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kube-api-access-cb76m\") pod \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " Apr 22 19:02:50.838768 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838606 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-uds\") pod \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " Apr 22 
19:02:50.838768 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838656 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-cache\") pod \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\" (UID: \"58ddb8db-6c88-4367-84f0-63d0ffd09ee4\") " Apr 22 19:02:50.838987 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838894 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "58ddb8db-6c88-4367-84f0-63d0ffd09ee4" (UID: "58ddb8db-6c88-4367-84f0-63d0ffd09ee4"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:02:50.838987 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838964 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "58ddb8db-6c88-4367-84f0-63d0ffd09ee4" (UID: "58ddb8db-6c88-4367-84f0-63d0ffd09ee4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:02:50.839077 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.838988 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "58ddb8db-6c88-4367-84f0-63d0ffd09ee4" (UID: "58ddb8db-6c88-4367-84f0-63d0ffd09ee4"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:02:50.839320 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.839299 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "58ddb8db-6c88-4367-84f0-63d0ffd09ee4" (UID: "58ddb8db-6c88-4367-84f0-63d0ffd09ee4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:02:50.840652 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.840633 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kube-api-access-cb76m" (OuterVolumeSpecName: "kube-api-access-cb76m") pod "58ddb8db-6c88-4367-84f0-63d0ffd09ee4" (UID: "58ddb8db-6c88-4367-84f0-63d0ffd09ee4"). InnerVolumeSpecName "kube-api-access-cb76m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:02:50.840811 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.840790 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "58ddb8db-6c88-4367-84f0-63d0ffd09ee4" (UID: "58ddb8db-6c88-4367-84f0-63d0ffd09ee4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:02:50.874233 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.874208 2577 generic.go:358] "Generic (PLEG): container finished" podID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerID="cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89" exitCode=0 Apr 22 19:02:50.874337 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.874285 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" Apr 22 19:02:50.874337 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.874296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" event={"ID":"58ddb8db-6c88-4367-84f0-63d0ffd09ee4","Type":"ContainerDied","Data":"cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89"} Apr 22 19:02:50.874420 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.874336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc" event={"ID":"58ddb8db-6c88-4367-84f0-63d0ffd09ee4","Type":"ContainerDied","Data":"82bf9ac3e629af7b5d1c56b86f41a7b197fbb1cd954a22b17fd8852cb80f084f"} Apr 22 19:02:50.874420 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.874354 2577 scope.go:117] "RemoveContainer" containerID="cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89" Apr 22 19:02:50.884235 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.884219 2577 scope.go:117] "RemoveContainer" containerID="545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2" Apr 22 19:02:50.892464 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.892445 2577 scope.go:117] "RemoveContainer" containerID="727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b" Apr 22 19:02:50.899594 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.899568 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc"] Apr 22 19:02:50.901681 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.901665 2577 scope.go:117] "RemoveContainer" containerID="cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89" Apr 22 19:02:50.902050 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:02:50.902029 2577 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89\": container with ID starting with cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89 not found: ID does not exist" containerID="cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89" Apr 22 19:02:50.902133 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.902057 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89"} err="failed to get container status \"cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89\": rpc error: code = NotFound desc = could not find container \"cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89\": container with ID starting with cefe786fd9e6c919dc30d9febc3aff110fef2a70b9f6779b706cde89aa09be89 not found: ID does not exist" Apr 22 19:02:50.902133 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.902075 2577 scope.go:117] "RemoveContainer" containerID="545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2" Apr 22 19:02:50.902341 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:02:50.902315 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2\": container with ID starting with 545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2 not found: ID does not exist" containerID="545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2" Apr 22 19:02:50.902396 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.902353 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2"} err="failed to get container status 
\"545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2\": rpc error: code = NotFound desc = could not find container \"545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2\": container with ID starting with 545757a3a23be9fb479c449d352bb1ab75f55ae40a78ba2121ae584521cdead2 not found: ID does not exist" Apr 22 19:02:50.902437 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.902376 2577 scope.go:117] "RemoveContainer" containerID="727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b" Apr 22 19:02:50.902656 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:02:50.902632 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b\": container with ID starting with 727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b not found: ID does not exist" containerID="727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b" Apr 22 19:02:50.902779 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.902660 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b"} err="failed to get container status \"727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b\": rpc error: code = NotFound desc = could not find container \"727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b\": container with ID starting with 727b0b3bb212784b5085c444dbffdffafb0146e1fdf9b847e7092f601e0d655b not found: ID does not exist" Apr 22 19:02:50.904495 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.904477 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tnfc"] Apr 22 19:02:50.939764 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.939744 2577 reconciler_common.go:299] "Volume detached for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-uds\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:02:50.939764 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.939763 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:02:50.939867 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.939773 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:02:50.939867 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.939784 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tokenizer-tmp\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:02:50.939867 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.939792 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:02:50.939867 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:50.939801 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cb76m\" (UniqueName: \"kubernetes.io/projected/58ddb8db-6c88-4367-84f0-63d0ffd09ee4-kube-api-access-cb76m\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:02:52.884774 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:52.884743 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" 
path="/var/lib/kubelet/pods/58ddb8db-6c88-4367-84f0-63d0ffd09ee4/volumes" Apr 22 19:02:58.446671 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.446586 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg"] Apr 22 19:02:58.447151 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.447054 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="tokenizer" Apr 22 19:02:58.447151 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.447069 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="tokenizer" Apr 22 19:02:58.447151 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.447085 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="storage-initializer" Apr 22 19:02:58.447151 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.447091 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="storage-initializer" Apr 22 19:02:58.447151 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.447101 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="main" Apr 22 19:02:58.447151 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.447106 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="main" Apr 22 19:02:58.447343 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.447172 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="main" Apr 22 19:02:58.447343 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.447181 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="58ddb8db-6c88-4367-84f0-63d0ffd09ee4" containerName="tokenizer" Apr 22 19:02:58.452237 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.452218 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.455306 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.455286 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 19:02:58.455420 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.455287 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:02:58.455420 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.455287 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 19:02:58.455420 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.455287 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hsgld\"" Apr 22 19:02:58.455546 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.455287 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-jnc2l\"" Apr 22 19:02:58.461745 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.461708 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg"] Apr 22 19:02:58.505027 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.505001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-cache\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.505176 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.505071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.505176 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.505130 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.505278 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.505182 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.505278 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.505233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqcsk\" (UniqueName: 
\"kubernetes.io/projected/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kube-api-access-tqcsk\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.505278 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.505272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606030 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606159 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606236 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606276 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606327 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqcsk\" (UniqueName: \"kubernetes.io/projected/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kube-api-access-tqcsk\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606429 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606429 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606541 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606541 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.606672 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.606651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.608726 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.608702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tls-certs\") 
pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.614203 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.614183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqcsk\" (UniqueName: \"kubernetes.io/projected/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kube-api-access-tqcsk\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.764137 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.764046 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:02:58.891854 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.891830 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg"] Apr 22 19:02:58.893911 ip-10-0-139-10 kubenswrapper[2577]: W0422 19:02:58.893873 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbdff8d5_c5cd_4b98_813a_e6a212b151a1.slice/crio-6b9f8ccce5e0b3f32a9e705d2b47fdefc60be8d51f8a9e3acf8d1f18ee5f53ad WatchSource:0}: Error finding container 6b9f8ccce5e0b3f32a9e705d2b47fdefc60be8d51f8a9e3acf8d1f18ee5f53ad: Status 404 returned error can't find the container with id 6b9f8ccce5e0b3f32a9e705d2b47fdefc60be8d51f8a9e3acf8d1f18ee5f53ad Apr 22 19:02:58.895746 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.895706 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:02:58.909610 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:58.909584 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" event={"ID":"dbdff8d5-c5cd-4b98-813a-e6a212b151a1","Type":"ContainerStarted","Data":"6b9f8ccce5e0b3f32a9e705d2b47fdefc60be8d51f8a9e3acf8d1f18ee5f53ad"} Apr 22 19:02:59.915677 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:59.915637 2577 generic.go:358] "Generic (PLEG): container finished" podID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerID="bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef" exitCode=0 Apr 22 19:02:59.916124 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:02:59.915733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" event={"ID":"dbdff8d5-c5cd-4b98-813a-e6a212b151a1","Type":"ContainerDied","Data":"bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef"} Apr 22 19:03:00.921448 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:00.921414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" event={"ID":"dbdff8d5-c5cd-4b98-813a-e6a212b151a1","Type":"ContainerStarted","Data":"4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355"} Apr 22 19:03:00.921448 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:00.921448 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" event={"ID":"dbdff8d5-c5cd-4b98-813a-e6a212b151a1","Type":"ContainerStarted","Data":"2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea"} Apr 22 19:03:00.921903 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:00.921564 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:03:00.943064 ip-10-0-139-10 kubenswrapper[2577]: 
I0422 19:03:00.943010 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" podStartSLOduration=2.942996623 podStartE2EDuration="2.942996623s" podCreationTimestamp="2026-04-22 19:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:03:00.940421828 +0000 UTC m=+1516.578946951" watchObservedRunningTime="2026-04-22 19:03:00.942996623 +0000 UTC m=+1516.581521744" Apr 22 19:03:08.764400 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:08.764355 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:03:08.764971 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:08.764410 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:03:08.766893 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:08.766867 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:03:08.958589 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:08.958554 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:03:29.963402 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:29.963374 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:03:48.891734 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.891693 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:03:48.896609 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.896590 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:48.899045 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.899017 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-5k2nv\"" Apr 22 19:03:48.899155 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.899047 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 19:03:48.907648 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.907563 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:03:48.957904 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.957873 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:48.957904 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.957912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa38dbc-26cc-4873-80cb-8063021d80d9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 
19:03:48.958133 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.957934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p46rj\" (UniqueName: \"kubernetes.io/projected/5aa38dbc-26cc-4873-80cb-8063021d80d9-kube-api-access-p46rj\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:48.958133 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.957956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:48.958133 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.957985 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:48.958133 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:48.958013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059333 ip-10-0-139-10 
kubenswrapper[2577]: I0422 19:03:49.059302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p46rj\" (UniqueName: \"kubernetes.io/projected/5aa38dbc-26cc-4873-80cb-8063021d80d9-kube-api-access-p46rj\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059333 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.059344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059579 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.059387 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059579 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.059424 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059579 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.059518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059579 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.059553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa38dbc-26cc-4873-80cb-8063021d80d9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059824 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.059808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059885 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.059820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.059953 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.059935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: 
\"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.061704 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.061680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.061909 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.061892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa38dbc-26cc-4873-80cb-8063021d80d9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.067793 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.067773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p46rj\" (UniqueName: \"kubernetes.io/projected/5aa38dbc-26cc-4873-80cb-8063021d80d9-kube-api-access-p46rj\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.209405 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.209373 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:03:49.337321 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:49.337297 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:03:49.339017 ip-10-0-139-10 kubenswrapper[2577]: W0422 19:03:49.338989 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aa38dbc_26cc_4873_80cb_8063021d80d9.slice/crio-c6a8d12fd7bef2d6c1225e902814c85e253ecc0ed9703816138f689a02cd0cac WatchSource:0}: Error finding container c6a8d12fd7bef2d6c1225e902814c85e253ecc0ed9703816138f689a02cd0cac: Status 404 returned error can't find the container with id c6a8d12fd7bef2d6c1225e902814c85e253ecc0ed9703816138f689a02cd0cac Apr 22 19:03:50.131111 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:50.131065 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5aa38dbc-26cc-4873-80cb-8063021d80d9","Type":"ContainerStarted","Data":"2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b"} Apr 22 19:03:50.131111 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:50.131114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5aa38dbc-26cc-4873-80cb-8063021d80d9","Type":"ContainerStarted","Data":"c6a8d12fd7bef2d6c1225e902814c85e253ecc0ed9703816138f689a02cd0cac"} Apr 22 19:03:55.153226 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:55.153196 2577 generic.go:358] "Generic (PLEG): container finished" podID="5aa38dbc-26cc-4873-80cb-8063021d80d9" containerID="2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b" exitCode=0 Apr 22 19:03:55.153613 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:03:55.153275 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5aa38dbc-26cc-4873-80cb-8063021d80d9","Type":"ContainerDied","Data":"2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b"} Apr 22 19:04:22.278038 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:04:22.278000 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5aa38dbc-26cc-4873-80cb-8063021d80d9","Type":"ContainerStarted","Data":"41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39"} Apr 22 19:04:22.296690 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:04:22.296640 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=7.630654272 podStartE2EDuration="34.29662514s" podCreationTimestamp="2026-04-22 19:03:48 +0000 UTC" firstStartedPulling="2026-04-22 19:03:55.154398336 +0000 UTC m=+1570.792923436" lastFinishedPulling="2026-04-22 19:04:21.8203692 +0000 UTC m=+1597.458894304" observedRunningTime="2026-04-22 19:04:22.294850474 +0000 UTC m=+1597.933375593" watchObservedRunningTime="2026-04-22 19:04:22.29662514 +0000 UTC m=+1597.935150262" Apr 22 19:06:07.909929 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:07.909830 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg"] Apr 22 19:06:07.910503 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:07.910225 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="main" containerID="cri-o://2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea" gracePeriod=30 Apr 22 19:06:07.910503 ip-10-0-139-10 
kubenswrapper[2577]: I0422 19:06:07.910335 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="tokenizer" containerID="cri-o://4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355" gracePeriod=30 Apr 22 19:06:07.966036 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:07.966003 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv"] Apr 22 19:06:07.970215 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:07.970190 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:07.973532 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:07.973354 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-pzzhm\"" Apr 22 19:06:07.973532 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:07.973354 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 22 19:06:07.981045 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:07.981017 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv"] Apr 22 19:06:08.034118 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.034248 
ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.034293 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.034331 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.034366 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1bdaf5e8-f724-418d-891d-690a795024e9-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.034432 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034417 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.034502 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034484 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j5kl\" (UniqueName: \"kubernetes.io/projected/1bdaf5e8-f724-418d-891d-690a795024e9-kube-api-access-5j5kl\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.034547 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1bdaf5e8-f724-418d-891d-690a795024e9-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.034583 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.034568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1bdaf5e8-f724-418d-891d-690a795024e9-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.135819 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.135778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.135819 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.135821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136088 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.135849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1bdaf5e8-f724-418d-891d-690a795024e9-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136088 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.135885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136088 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.135912 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5j5kl\" (UniqueName: \"kubernetes.io/projected/1bdaf5e8-f724-418d-891d-690a795024e9-kube-api-access-5j5kl\") pod 
\"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136088 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.135944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1bdaf5e8-f724-418d-891d-690a795024e9-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136088 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.136009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1bdaf5e8-f724-418d-891d-690a795024e9-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136088 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.136072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136402 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.136114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136402 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.136308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136402 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.136323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136402 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.136347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136612 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.136596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.136914 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.136893 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1bdaf5e8-f724-418d-891d-690a795024e9-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.138479 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.138445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1bdaf5e8-f724-418d-891d-690a795024e9-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.138839 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.138792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1bdaf5e8-f724-418d-891d-690a795024e9-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.145440 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.145419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1bdaf5e8-f724-418d-891d-690a795024e9-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.145817 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.145794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j5kl\" (UniqueName: \"kubernetes.io/projected/1bdaf5e8-f724-418d-891d-690a795024e9-kube-api-access-5j5kl\") 
pod \"router-gateway-2-openshift-default-6866b85949-fdjqv\" (UID: \"1bdaf5e8-f724-418d-891d-690a795024e9\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.288153 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.287911 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:08.441936 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.441908 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv"] Apr 22 19:06:08.443767 ip-10-0-139-10 kubenswrapper[2577]: W0422 19:06:08.443734 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bdaf5e8_f724_418d_891d_690a795024e9.slice/crio-1c09a1bd78b41b6233f3b0eb46489ef056f9f74b31de595b36c94b92aff448cb WatchSource:0}: Error finding container 1c09a1bd78b41b6233f3b0eb46489ef056f9f74b31de595b36c94b92aff448cb: Status 404 returned error can't find the container with id 1c09a1bd78b41b6233f3b0eb46489ef056f9f74b31de595b36c94b92aff448cb Apr 22 19:06:08.445943 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.445907 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 19:06:08.446050 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.445985 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 19:06:08.446050 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.446024 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 22 
19:06:08.723143 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.723108 2577 generic.go:358] "Generic (PLEG): container finished" podID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerID="2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea" exitCode=0 Apr 22 19:06:08.723376 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.723193 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" event={"ID":"dbdff8d5-c5cd-4b98-813a-e6a212b151a1","Type":"ContainerDied","Data":"2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea"} Apr 22 19:06:08.724660 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.724633 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" event={"ID":"1bdaf5e8-f724-418d-891d-690a795024e9","Type":"ContainerStarted","Data":"cd70e7ba99e4082be38cf7de9a632f076f07fd7c4d21886917ae395515973524"} Apr 22 19:06:08.724817 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.724665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" event={"ID":"1bdaf5e8-f724-418d-891d-690a795024e9","Type":"ContainerStarted","Data":"1c09a1bd78b41b6233f3b0eb46489ef056f9f74b31de595b36c94b92aff448cb"} Apr 22 19:06:08.747531 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.747473 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" podStartSLOduration=1.747457257 podStartE2EDuration="1.747457257s" podCreationTimestamp="2026-04-22 19:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:08.743546995 +0000 UTC m=+1704.382072120" watchObservedRunningTime="2026-04-22 19:06:08.747457257 +0000 UTC m=+1704.385982379" Apr 22 
19:06:08.958187 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:08.958140 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.54:8082/healthz\": dial tcp 10.133.0.54:8082: connect: connection refused" Apr 22 19:06:09.290298 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.290220 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:06:09.290689 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.290669 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:09.295487 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.295468 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:09.348967 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.348938 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-uds\") pod \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " Apr 22 19:06:09.349152 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.348977 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-tmp\") pod \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " Apr 22 19:06:09.349152 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349009 2577 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tqcsk\" (UniqueName: \"kubernetes.io/projected/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kube-api-access-tqcsk\") pod \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " Apr 22 19:06:09.349152 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349069 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tls-certs\") pod \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " Apr 22 19:06:09.349324 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349271 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "dbdff8d5-c5cd-4b98-813a-e6a212b151a1" (UID: "dbdff8d5-c5cd-4b98-813a-e6a212b151a1"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:09.349324 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349312 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-cache\") pod \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " Apr 22 19:06:09.349434 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349392 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kserve-provision-location\") pod \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\" (UID: \"dbdff8d5-c5cd-4b98-813a-e6a212b151a1\") " Apr 22 19:06:09.349567 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349543 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "dbdff8d5-c5cd-4b98-813a-e6a212b151a1" (UID: "dbdff8d5-c5cd-4b98-813a-e6a212b151a1"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:09.349692 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349624 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "dbdff8d5-c5cd-4b98-813a-e6a212b151a1" (UID: "dbdff8d5-c5cd-4b98-813a-e6a212b151a1"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:09.349850 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349832 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:06:09.349918 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349856 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-uds\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:06:09.349918 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.349870 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tokenizer-tmp\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:06:09.350282 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.350252 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dbdff8d5-c5cd-4b98-813a-e6a212b151a1" (UID: "dbdff8d5-c5cd-4b98-813a-e6a212b151a1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:06:09.351361 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.351342 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dbdff8d5-c5cd-4b98-813a-e6a212b151a1" (UID: "dbdff8d5-c5cd-4b98-813a-e6a212b151a1"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:09.351598 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.351578 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kube-api-access-tqcsk" (OuterVolumeSpecName: "kube-api-access-tqcsk") pod "dbdff8d5-c5cd-4b98-813a-e6a212b151a1" (UID: "dbdff8d5-c5cd-4b98-813a-e6a212b151a1"). InnerVolumeSpecName "kube-api-access-tqcsk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:09.451057 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.451010 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:06:09.451057 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.451057 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tqcsk\" (UniqueName: \"kubernetes.io/projected/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-kube-api-access-tqcsk\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:06:09.451268 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.451073 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbdff8d5-c5cd-4b98-813a-e6a212b151a1-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:06:09.736470 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.736426 2577 generic.go:358] "Generic (PLEG): container finished" podID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerID="4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355" exitCode=0 Apr 22 19:06:09.736669 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.736513 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" Apr 22 19:06:09.736669 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.736509 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" event={"ID":"dbdff8d5-c5cd-4b98-813a-e6a212b151a1","Type":"ContainerDied","Data":"4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355"} Apr 22 19:06:09.736669 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.736556 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg" event={"ID":"dbdff8d5-c5cd-4b98-813a-e6a212b151a1","Type":"ContainerDied","Data":"6b9f8ccce5e0b3f32a9e705d2b47fdefc60be8d51f8a9e3acf8d1f18ee5f53ad"} Apr 22 19:06:09.736669 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.736577 2577 scope.go:117] "RemoveContainer" containerID="4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355" Apr 22 19:06:09.737312 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.737158 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:09.738292 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.738264 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-fdjqv" Apr 22 19:06:09.746264 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.746247 2577 scope.go:117] "RemoveContainer" containerID="2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea" Apr 22 19:06:09.754465 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.754449 2577 scope.go:117] "RemoveContainer" containerID="bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef" Apr 22 19:06:09.764068 ip-10-0-139-10 kubenswrapper[2577]: I0422 
19:06:09.763803 2577 scope.go:117] "RemoveContainer" containerID="4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355" Apr 22 19:06:09.764143 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:06:09.764123 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355\": container with ID starting with 4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355 not found: ID does not exist" containerID="4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355" Apr 22 19:06:09.764197 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.764156 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355"} err="failed to get container status \"4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355\": rpc error: code = NotFound desc = could not find container \"4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355\": container with ID starting with 4c662800293a110e8fa9d5103763ef25917a6bcd6053d618751b2d7e90445355 not found: ID does not exist" Apr 22 19:06:09.764197 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.764179 2577 scope.go:117] "RemoveContainer" containerID="2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea" Apr 22 19:06:09.764436 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:06:09.764415 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea\": container with ID starting with 2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea not found: ID does not exist" containerID="2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea" Apr 22 19:06:09.764555 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.764438 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea"} err="failed to get container status \"2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea\": rpc error: code = NotFound desc = could not find container \"2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea\": container with ID starting with 2573fb50c7f6a5d72c1ab2b31bf4687ce049ddc8c41c76cc6248da2422c04aea not found: ID does not exist" Apr 22 19:06:09.764555 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.764453 2577 scope.go:117] "RemoveContainer" containerID="bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef" Apr 22 19:06:09.764709 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:06:09.764690 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef\": container with ID starting with bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef not found: ID does not exist" containerID="bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef" Apr 22 19:06:09.764791 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.764726 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef"} err="failed to get container status \"bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef\": rpc error: code = NotFound desc = could not find container \"bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef\": container with ID starting with bea04812041b55ae1e0494a8e957cb6eeefe5568243f125fb83d38b3b743edef not found: ID does not exist" Apr 22 19:06:09.780061 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.780037 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg"] Apr 22 19:06:09.789008 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:09.788981 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5f58cd5vqg"] Apr 22 19:06:10.884976 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:10.884947 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" path="/var/lib/kubelet/pods/dbdff8d5-c5cd-4b98-813a-e6a212b151a1/volumes" Apr 22 19:06:25.791126 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791092 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"] Apr 22 19:06:25.791527 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791513 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="storage-initializer" Apr 22 19:06:25.791572 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791530 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="storage-initializer" Apr 22 19:06:25.791572 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791544 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="tokenizer" Apr 22 19:06:25.791572 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791550 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="tokenizer" Apr 22 19:06:25.791572 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791564 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="main" Apr 22 19:06:25.791572 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791570 2577 
state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="main" Apr 22 19:06:25.791755 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791647 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="tokenizer" Apr 22 19:06:25.791755 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.791660 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbdff8d5-c5cd-4b98-813a-e6a212b151a1" containerName="main" Apr 22 19:06:25.796653 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.796633 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.800332 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.800305 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-ksc56\"" Apr 22 19:06:25.800591 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.800571 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 19:06:25.803080 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.803057 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"] Apr 22 19:06:25.818094 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.818073 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"] Apr 22 19:06:25.822102 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.822084 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.834758 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.834734 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"] Apr 22 19:06:25.897666 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.897626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.897666 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.897671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-home\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.897903 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.897689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.897903 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.897774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f8644bad-55fe-48ed-9971-5c691355194c-tls-certs\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.897903 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.897807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.897903 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.897831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-model-cache\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.898134 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.897922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-dshm\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.898134 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.897959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-home\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: 
\"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.898134 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.898052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4gr\" (UniqueName: \"kubernetes.io/projected/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kube-api-access-nl4gr\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.898134 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.898108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52mdj\" (UniqueName: \"kubernetes.io/projected/f8644bad-55fe-48ed-9971-5c691355194c-kube-api-access-52mdj\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.898315 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.898137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.898315 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.898177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: 
\"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.999233 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-dshm\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.999233 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-home\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.999547 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4gr\" (UniqueName: \"kubernetes.io/projected/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kube-api-access-nl4gr\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.999547 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52mdj\" (UniqueName: \"kubernetes.io/projected/f8644bad-55fe-48ed-9971-5c691355194c-kube-api-access-52mdj\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.999547 
ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.999743 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.999790 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.999845 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999795 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-home\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.999845 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:25.999927 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8644bad-55fe-48ed-9971-5c691355194c-tls-certs\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:25.999927 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:26.000028 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-home\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:26.000028 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:25.999952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-model-cache\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: 
\"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:26.000188 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.000168 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:26.000286 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.000267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:26.000286 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.000277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-model-cache\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:26.000564 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.000534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:26.000662 
ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.000594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-home\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:26.002063 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.002040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-dshm\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:26.002249 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.002234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:26.002566 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.002545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8644bad-55fe-48ed-9971-5c691355194c-tls-certs\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:26.002631 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.002609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-tls-certs\") pod 
\"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:26.010516 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.010493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4gr\" (UniqueName: \"kubernetes.io/projected/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kube-api-access-nl4gr\") pod \"router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:26.011467 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.011443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52mdj\" (UniqueName: \"kubernetes.io/projected/f8644bad-55fe-48ed-9971-5c691355194c-kube-api-access-52mdj\") pod \"router-with-refs-pd-test-kserve-54cbd5b894-xl9kh\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:26.110258 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.110183 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:26.136990 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.136960 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:06:26.260904 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.260871 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"] Apr 22 19:06:26.261331 ip-10-0-139-10 kubenswrapper[2577]: W0422 19:06:26.261300 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8644bad_55fe_48ed_9971_5c691355194c.slice/crio-3c58b924a355e0f57455f4a8f6b7188d6bbe6729b699377c420ee1ef2c45f689 WatchSource:0}: Error finding container 3c58b924a355e0f57455f4a8f6b7188d6bbe6729b699377c420ee1ef2c45f689: Status 404 returned error can't find the container with id 3c58b924a355e0f57455f4a8f6b7188d6bbe6729b699377c420ee1ef2c45f689 Apr 22 19:06:26.282884 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.282861 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"] Apr 22 19:06:26.284153 ip-10-0-139-10 kubenswrapper[2577]: W0422 19:06:26.284133 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f04d7ec_bd13_4bf8_9389_460dca58e4e0.slice/crio-be5faaccfe869051984852ffccde21fd5229307714f7e12dbf9a248ac6ba4e6d WatchSource:0}: Error finding container be5faaccfe869051984852ffccde21fd5229307714f7e12dbf9a248ac6ba4e6d: Status 404 returned error can't find the container with id be5faaccfe869051984852ffccde21fd5229307714f7e12dbf9a248ac6ba4e6d Apr 22 19:06:26.811791 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.811751 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" 
event={"ID":"f8644bad-55fe-48ed-9971-5c691355194c","Type":"ContainerStarted","Data":"3c58b924a355e0f57455f4a8f6b7188d6bbe6729b699377c420ee1ef2c45f689"} Apr 22 19:06:26.814164 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.813747 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" event={"ID":"7f04d7ec-bd13-4bf8-9389-460dca58e4e0","Type":"ContainerStarted","Data":"dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3"} Apr 22 19:06:26.814164 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:26.813788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" event={"ID":"7f04d7ec-bd13-4bf8-9389-460dca58e4e0","Type":"ContainerStarted","Data":"be5faaccfe869051984852ffccde21fd5229307714f7e12dbf9a248ac6ba4e6d"} Apr 22 19:06:27.819319 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:27.819274 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" event={"ID":"f8644bad-55fe-48ed-9971-5c691355194c","Type":"ContainerStarted","Data":"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527"} Apr 22 19:06:27.820080 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:27.819351 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:06:28.825423 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:28.825387 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" event={"ID":"f8644bad-55fe-48ed-9971-5c691355194c","Type":"ContainerStarted","Data":"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e"} Apr 22 19:06:30.662777 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:06:30.662744 2577 cadvisor_stats_provider.go:525] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f04d7ec_bd13_4bf8_9389_460dca58e4e0.slice/crio-conmon-dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3.scope\": RecentStats: unable to find data in memory cache]" Apr 22 19:06:30.836672 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:30.836594 2577 generic.go:358] "Generic (PLEG): container finished" podID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerID="dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3" exitCode=0 Apr 22 19:06:30.836867 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:30.836673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" event={"ID":"7f04d7ec-bd13-4bf8-9389-460dca58e4e0","Type":"ContainerDied","Data":"dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3"} Apr 22 19:06:31.844459 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:31.844418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" event={"ID":"7f04d7ec-bd13-4bf8-9389-460dca58e4e0","Type":"ContainerStarted","Data":"5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9"} Apr 22 19:06:31.867507 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:31.867448 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podStartSLOduration=6.867432153 podStartE2EDuration="6.867432153s" podCreationTimestamp="2026-04-22 19:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:31.864649511 +0000 UTC m=+1727.503174632" watchObservedRunningTime="2026-04-22 19:06:31.867432153 +0000 UTC m=+1727.505957274" Apr 22 19:06:33.855218 ip-10-0-139-10 kubenswrapper[2577]: I0422 
19:06:33.855177 2577 generic.go:358] "Generic (PLEG): container finished" podID="f8644bad-55fe-48ed-9971-5c691355194c" containerID="0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e" exitCode=0
Apr 22 19:06:33.855640 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:33.855229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" event={"ID":"f8644bad-55fe-48ed-9971-5c691355194c","Type":"ContainerDied","Data":"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e"}
Apr 22 19:06:34.861535 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:34.861488 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" event={"ID":"f8644bad-55fe-48ed-9971-5c691355194c","Type":"ContainerStarted","Data":"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb"}
Apr 22 19:06:34.886892 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:34.886834 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podStartSLOduration=8.989146646 podStartE2EDuration="9.886816031s" podCreationTimestamp="2026-04-22 19:06:25 +0000 UTC" firstStartedPulling="2026-04-22 19:06:26.263263467 +0000 UTC m=+1721.901788568" lastFinishedPulling="2026-04-22 19:06:27.160932853 +0000 UTC m=+1722.799457953" observedRunningTime="2026-04-22 19:06:34.883370492 +0000 UTC m=+1730.521895616" watchObservedRunningTime="2026-04-22 19:06:34.886816031 +0000 UTC m=+1730.525341154"
Apr 22 19:06:36.110663 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:36.110619 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"
Apr 22 19:06:36.111074 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:36.110781 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"
Apr 22 19:06:36.112462 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:36.112428 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:06:36.137898 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:36.137864 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"
Apr 22 19:06:36.138015 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:36.137911 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"
Apr 22 19:06:36.139393 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:36.139360 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:06:46.110645 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:46.110596 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:06:46.137767 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:46.137700 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:06:46.886645 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:46.886600 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"
Apr 22 19:06:55.688501 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:55.688465 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 19:06:55.688929 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:55.688773 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="5aa38dbc-26cc-4873-80cb-8063021d80d9" containerName="main" containerID="cri-o://41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39" gracePeriod=30
Apr 22 19:06:56.110834 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.110703 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:06:56.137596 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.137550 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:06:56.476329 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.476307 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 19:06:56.618031 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.618002 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-kserve-provision-location\") pod \"5aa38dbc-26cc-4873-80cb-8063021d80d9\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") "
Apr 22 19:06:56.618236 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.618060 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-dshm\") pod \"5aa38dbc-26cc-4873-80cb-8063021d80d9\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") "
Apr 22 19:06:56.618236 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.618101 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-home\") pod \"5aa38dbc-26cc-4873-80cb-8063021d80d9\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") "
Apr 22 19:06:56.618236 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.618118 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa38dbc-26cc-4873-80cb-8063021d80d9-tls-certs\") pod \"5aa38dbc-26cc-4873-80cb-8063021d80d9\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") "
Apr 22 19:06:56.618236 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.618134 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p46rj\" (UniqueName: \"kubernetes.io/projected/5aa38dbc-26cc-4873-80cb-8063021d80d9-kube-api-access-p46rj\") pod \"5aa38dbc-26cc-4873-80cb-8063021d80d9\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") "
Apr 22 19:06:56.618236 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.618165 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-model-cache\") pod \"5aa38dbc-26cc-4873-80cb-8063021d80d9\" (UID: \"5aa38dbc-26cc-4873-80cb-8063021d80d9\") "
Apr 22 19:06:56.618573 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.618546 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-home" (OuterVolumeSpecName: "home") pod "5aa38dbc-26cc-4873-80cb-8063021d80d9" (UID: "5aa38dbc-26cc-4873-80cb-8063021d80d9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:56.618640 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.618615 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-model-cache" (OuterVolumeSpecName: "model-cache") pod "5aa38dbc-26cc-4873-80cb-8063021d80d9" (UID: "5aa38dbc-26cc-4873-80cb-8063021d80d9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:56.620380 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.620354 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-dshm" (OuterVolumeSpecName: "dshm") pod "5aa38dbc-26cc-4873-80cb-8063021d80d9" (UID: "5aa38dbc-26cc-4873-80cb-8063021d80d9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:56.620850 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.620830 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa38dbc-26cc-4873-80cb-8063021d80d9-kube-api-access-p46rj" (OuterVolumeSpecName: "kube-api-access-p46rj") pod "5aa38dbc-26cc-4873-80cb-8063021d80d9" (UID: "5aa38dbc-26cc-4873-80cb-8063021d80d9"). InnerVolumeSpecName "kube-api-access-p46rj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:06:56.620986 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.620966 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aa38dbc-26cc-4873-80cb-8063021d80d9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5aa38dbc-26cc-4873-80cb-8063021d80d9" (UID: "5aa38dbc-26cc-4873-80cb-8063021d80d9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:06:56.646901 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.646867 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5aa38dbc-26cc-4873-80cb-8063021d80d9" (UID: "5aa38dbc-26cc-4873-80cb-8063021d80d9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:56.719077 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.719030 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 19:06:56.719077 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.719066 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-dshm\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 19:06:56.719077 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.719076 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-home\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 19:06:56.719077 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.719084 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa38dbc-26cc-4873-80cb-8063021d80d9-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 19:06:56.719531 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.719093 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p46rj\" (UniqueName: \"kubernetes.io/projected/5aa38dbc-26cc-4873-80cb-8063021d80d9-kube-api-access-p46rj\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 19:06:56.719531 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.719101 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5aa38dbc-26cc-4873-80cb-8063021d80d9-model-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\""
Apr 22 19:06:56.959778 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.959695 2577 generic.go:358] "Generic (PLEG): container finished" podID="5aa38dbc-26cc-4873-80cb-8063021d80d9" containerID="41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39" exitCode=0
Apr 22 19:06:56.959778 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.959756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5aa38dbc-26cc-4873-80cb-8063021d80d9","Type":"ContainerDied","Data":"41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39"}
Apr 22 19:06:56.960025 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.959786 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 19:06:56.960025 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.959808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"5aa38dbc-26cc-4873-80cb-8063021d80d9","Type":"ContainerDied","Data":"c6a8d12fd7bef2d6c1225e902814c85e253ecc0ed9703816138f689a02cd0cac"}
Apr 22 19:06:56.960025 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.959829 2577 scope.go:117] "RemoveContainer" containerID="41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39"
Apr 22 19:06:56.979262 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.979204 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 19:06:56.980050 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.979944 2577 scope.go:117] "RemoveContainer" containerID="2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b"
Apr 22 19:06:56.984314 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:56.984288 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 19:06:57.022522 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:57.022500 2577 scope.go:117] "RemoveContainer" containerID="41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39"
Apr 22 19:06:57.022944 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:06:57.022878 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39\": container with ID starting with 41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39 not found: ID does not exist" containerID="41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39"
Apr 22 19:06:57.022944 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:57.022918 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39"} err="failed to get container status \"41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39\": rpc error: code = NotFound desc = could not find container \"41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39\": container with ID starting with 41ca758ddb6c4a1850809704d259e6aa01e078fc868c690f790281b8a101bd39 not found: ID does not exist"
Apr 22 19:06:57.022944 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:57.022944 2577 scope.go:117] "RemoveContainer" containerID="2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b"
Apr 22 19:06:57.023258 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:06:57.023230 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b\": container with ID starting with 2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b not found: ID does not exist" containerID="2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b"
Apr 22 19:06:57.023390 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:57.023264 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b"} err="failed to get container status \"2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b\": rpc error: code = NotFound desc = could not find container \"2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b\": container with ID starting with 2b48f7f6b3c480ab4563ec52774360ef71c5c34eb595da1de8e7f35ba9103f5b not found: ID does not exist"
Apr 22 19:06:58.885262 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:06:58.885221 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aa38dbc-26cc-4873-80cb-8063021d80d9" path="/var/lib/kubelet/pods/5aa38dbc-26cc-4873-80cb-8063021d80d9/volumes"
Apr 22 19:07:06.111559 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:06.111504 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:07:06.138264 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:06.138225 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:07:16.111589 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:16.111535 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:07:16.138114 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:16.138072 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:07:26.111085 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:26.111031 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:07:26.137360 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:26.137327 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:07:36.111575 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:36.111529 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:07:36.138189 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:36.138154 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:07:46.111368 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:46.111325 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:07:46.137916 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:46.137874 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:07:56.110805 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:56.110749 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:07:56.138138 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:07:56.138105 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:08:06.110931 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:06.110881 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:08:06.137408 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:06.137368 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:08:16.111634 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:16.111584 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:08:16.137664 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:16.137625 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:08:26.111437 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:26.111378 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:08:26.138361 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:26.138318 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:08:36.110603 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:36.110558 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" probeResult="failure" output="Get \"https://10.133.0.57:8001/health\": dial tcp 10.133.0.57:8001: connect: connection refused"
Apr 22 19:08:36.138023 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:36.137987 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" probeResult="failure" output="Get \"https://10.133.0.58:8000/health\": dial tcp 10.133.0.58:8000: connect: connection refused"
Apr 22 19:08:46.121050 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:46.121015 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"
Apr 22 19:08:46.132948 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:46.132924 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"
Apr 22 19:08:46.148989 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:46.148960 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"
Apr 22 19:08:46.156704 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:46.156681 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"
Apr 22 19:08:57.637090 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:57.636847 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"]
Apr 22 19:08:57.637776 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:57.637701 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" containerID="cri-o://5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9" gracePeriod=30
Apr 22 19:08:57.639456 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:57.639432 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"]
Apr 22 19:08:57.639854 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:08:57.639802 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" containerID="cri-o://3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb" gracePeriod=30
Apr 22 19:09:13.739143 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:13.739111 2577 ???:1] "http: TLS handshake error from 10.0.139.10:43562: EOF"
Apr 22 19:09:13.742749 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:13.742710 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log"
Apr 22 19:09:13.779948 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:13.779916 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log"
Apr 22 19:09:13.787242 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:13.787214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log"
Apr 22 19:09:13.798215 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:13.798190 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log"
Apr 22 19:09:13.817087 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:13.817062 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log"
Apr 22 19:09:13.825215 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:13.825197 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log"
Apr 22 19:09:14.804141 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:14.804114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log"
Apr 22 19:09:14.831028 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:14.831001 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log"
Apr 22 19:09:14.839555 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:14.839534 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log"
Apr 22 19:09:14.850664 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:14.850641 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log"
Apr 22 19:09:14.871967 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:14.871936 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log"
Apr 22 19:09:14.884539 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:14.884523 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log"
Apr 22 19:09:15.888550 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:15.888521 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log"
Apr 22 19:09:15.913734 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:15.913687 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log"
Apr 22 19:09:15.921855 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:15.921821 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log"
Apr 22 19:09:15.932869 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:15.932847 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log"
Apr 22 19:09:15.954821 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:15.954800 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log"
Apr 22 19:09:15.963012 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:15.962994 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log"
Apr 22 19:09:16.921088 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:16.921061 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log"
Apr 22 19:09:16.945750 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:16.945710 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log"
Apr 22 19:09:16.954825 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:16.954804 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log"
Apr 22 19:09:16.965815 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:16.965783 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log"
Apr 22 19:09:16.988325 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:16.988301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log"
Apr 22 19:09:16.995614 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:16.995587 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log"
Apr 22 19:09:17.949888 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:17.949851 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log"
Apr 22 19:09:17.974961 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:17.974935 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log"
Apr 22 19:09:17.982403 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:17.982380 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log"
Apr 22 19:09:17.992786 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:17.992766 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log"
Apr 22 19:09:18.012878 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:18.012861 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log"
Apr 22 19:09:18.026096 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:18.026073 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log"
Apr 22 19:09:19.051961 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:19.051927 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log"
Apr 22 19:09:19.079880 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:19.079850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log"
Apr 22 19:09:19.087667 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:19.087646 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log"
Apr 22 19:09:19.098071 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:19.098043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log"
Apr 22 19:09:19.118580 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:19.118554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log"
Apr 22 19:09:19.124823 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:19.124804 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log"
Apr 22 19:09:20.090565 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:20.090529 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log"
Apr 22 19:09:20.116793 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:20.116746 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log"
Apr 22 19:09:20.123761 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:20.123738 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log"
Apr 22 19:09:20.137356 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:20.137313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log"
Apr 22 19:09:20.160022 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:20.159995 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log"
Apr 22 19:09:20.166349 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:20.166329 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log"
Apr 22 19:09:21.117651 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:21.117605 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log"
Apr 22 19:09:21.146592 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:21.146566 2577 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:21.158037 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:21.158018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log" Apr 22 19:09:21.169589 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:21.169555 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log" Apr 22 19:09:21.190025 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:21.189980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log" Apr 22 19:09:21.196188 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:21.196167 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log" Apr 22 19:09:22.158579 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:22.158547 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log" Apr 22 19:09:22.183084 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:22.183059 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:22.192210 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:22.192183 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log" Apr 22 19:09:22.202903 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:22.202880 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log" Apr 22 19:09:22.226199 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:22.226168 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log" Apr 22 19:09:22.232889 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:22.232868 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log" Apr 22 19:09:23.203164 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:23.203136 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log" Apr 22 19:09:23.226987 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:23.226962 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:23.234103 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:23.234083 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log" Apr 22 19:09:23.244065 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:23.244025 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log" Apr 22 19:09:23.263329 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:23.263299 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log" Apr 22 19:09:23.269783 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:23.269758 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log" Apr 22 19:09:24.384207 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:24.384168 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log" Apr 22 19:09:24.409664 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:24.409632 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:24.417246 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:24.417216 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log" Apr 22 19:09:24.427397 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:24.427371 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log" Apr 22 19:09:24.450861 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:24.450834 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log" Apr 22 19:09:24.459095 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:24.459073 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log" Apr 22 19:09:25.446146 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:25.446112 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log" Apr 22 19:09:25.471257 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:25.471228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:25.481829 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:25.481800 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log" Apr 22 19:09:25.492041 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:25.492017 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log" Apr 22 19:09:25.511720 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:25.511696 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log" Apr 22 19:09:25.519276 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:25.519256 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log" Apr 22 19:09:26.469372 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:26.469331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log" Apr 22 19:09:26.494381 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:26.494359 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:26.504899 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:26.504878 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log" Apr 22 19:09:26.519312 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:26.519285 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log" Apr 22 19:09:26.539869 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:26.539846 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log" Apr 22 19:09:26.546790 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:26.546774 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log" Apr 22 19:09:27.476940 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.476906 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-fdjqv_1bdaf5e8-f724-418d-891d-690a795024e9/istio-proxy/0.log" Apr 22 19:09:27.500132 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.500106 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:27.509882 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.509857 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/llm-d-routing-sidecar/0.log" Apr 22 19:09:27.520507 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.520486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/storage-initializer/0.log" Apr 22 19:09:27.542610 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.542588 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/main/0.log" Apr 22 19:09:27.555492 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.555469 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n_7f04d7ec-bd13-4bf8-9389-460dca58e4e0/storage-initializer/0.log" Apr 22 19:09:27.640362 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.640326 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="llm-d-routing-sidecar" containerID="cri-o://26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527" gracePeriod=2 Apr 22 19:09:27.903161 ip-10-0-139-10 kubenswrapper[2577]: I0422 
19:09:27.903133 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:09:27.917787 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.917769 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:27.918484 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:27.918465 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:09:28.013511 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013434 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-kserve-provision-location\") pod \"f8644bad-55fe-48ed-9971-5c691355194c\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " Apr 22 19:09:28.013511 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013466 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-home\") pod \"f8644bad-55fe-48ed-9971-5c691355194c\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " Apr 22 19:09:28.013511 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013482 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-model-cache\") pod \"f8644bad-55fe-48ed-9971-5c691355194c\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " Apr 22 19:09:28.013782 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013514 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl4gr\" (UniqueName: 
\"kubernetes.io/projected/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kube-api-access-nl4gr\") pod \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " Apr 22 19:09:28.013782 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013592 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-dshm\") pod \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " Apr 22 19:09:28.013782 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013752 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-model-cache\") pod \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " Apr 22 19:09:28.013943 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013776 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-model-cache" (OuterVolumeSpecName: "model-cache") pod "f8644bad-55fe-48ed-9971-5c691355194c" (UID: "f8644bad-55fe-48ed-9971-5c691355194c"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:28.013943 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013800 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kserve-provision-location\") pod \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " Apr 22 19:09:28.013943 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013838 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8644bad-55fe-48ed-9971-5c691355194c-tls-certs\") pod \"f8644bad-55fe-48ed-9971-5c691355194c\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " Apr 22 19:09:28.013943 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013896 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-home" (OuterVolumeSpecName: "home") pod "f8644bad-55fe-48ed-9971-5c691355194c" (UID: "f8644bad-55fe-48ed-9971-5c691355194c"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:28.013943 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013911 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52mdj\" (UniqueName: \"kubernetes.io/projected/f8644bad-55fe-48ed-9971-5c691355194c-kube-api-access-52mdj\") pod \"f8644bad-55fe-48ed-9971-5c691355194c\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " Apr 22 19:09:28.013943 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013940 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-home\") pod \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " Apr 22 19:09:28.014240 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.013974 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-tls-certs\") pod \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\" (UID: \"7f04d7ec-bd13-4bf8-9389-460dca58e4e0\") " Apr 22 19:09:28.014240 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.014019 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-dshm\") pod \"f8644bad-55fe-48ed-9971-5c691355194c\" (UID: \"f8644bad-55fe-48ed-9971-5c691355194c\") " Apr 22 19:09:28.014240 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.014118 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-model-cache" (OuterVolumeSpecName: "model-cache") pod "7f04d7ec-bd13-4bf8-9389-460dca58e4e0" (UID: "7f04d7ec-bd13-4bf8-9389-460dca58e4e0"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:28.014398 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.014346 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-home" (OuterVolumeSpecName: "home") pod "7f04d7ec-bd13-4bf8-9389-460dca58e4e0" (UID: "7f04d7ec-bd13-4bf8-9389-460dca58e4e0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:28.014450 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.014432 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-home\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.014498 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.014455 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-model-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.014498 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.014470 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-model-cache\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.014498 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.014482 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-home\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.016375 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.016347 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-dshm" (OuterVolumeSpecName: "dshm") pod "7f04d7ec-bd13-4bf8-9389-460dca58e4e0" (UID: 
"7f04d7ec-bd13-4bf8-9389-460dca58e4e0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:28.016556 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.016522 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kube-api-access-nl4gr" (OuterVolumeSpecName: "kube-api-access-nl4gr") pod "7f04d7ec-bd13-4bf8-9389-460dca58e4e0" (UID: "7f04d7ec-bd13-4bf8-9389-460dca58e4e0"). InnerVolumeSpecName "kube-api-access-nl4gr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:09:28.016675 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.016554 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-dshm" (OuterVolumeSpecName: "dshm") pod "f8644bad-55fe-48ed-9971-5c691355194c" (UID: "f8644bad-55fe-48ed-9971-5c691355194c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:28.016675 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.016633 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8644bad-55fe-48ed-9971-5c691355194c-kube-api-access-52mdj" (OuterVolumeSpecName: "kube-api-access-52mdj") pod "f8644bad-55fe-48ed-9971-5c691355194c" (UID: "f8644bad-55fe-48ed-9971-5c691355194c"). InnerVolumeSpecName "kube-api-access-52mdj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:09:28.016854 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.016834 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7f04d7ec-bd13-4bf8-9389-460dca58e4e0" (UID: "7f04d7ec-bd13-4bf8-9389-460dca58e4e0"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:09:28.017265 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.017247 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8644bad-55fe-48ed-9971-5c691355194c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f8644bad-55fe-48ed-9971-5c691355194c" (UID: "f8644bad-55fe-48ed-9971-5c691355194c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:09:28.031780 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.031748 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7f04d7ec-bd13-4bf8-9389-460dca58e4e0" (UID: "7f04d7ec-bd13-4bf8-9389-460dca58e4e0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:28.062899 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.062874 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8644bad-55fe-48ed-9971-5c691355194c" (UID: "f8644bad-55fe-48ed-9971-5c691355194c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:28.115079 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.115053 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52mdj\" (UniqueName: \"kubernetes.io/projected/f8644bad-55fe-48ed-9971-5c691355194c-kube-api-access-52mdj\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.115079 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.115077 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.115208 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.115087 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-dshm\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.115208 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.115095 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8644bad-55fe-48ed-9971-5c691355194c-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.115208 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.115104 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nl4gr\" (UniqueName: \"kubernetes.io/projected/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kube-api-access-nl4gr\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.115208 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.115112 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-dshm\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.115208 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.115121 2577 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f04d7ec-bd13-4bf8-9389-460dca58e4e0-kserve-provision-location\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.115208 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.115129 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8644bad-55fe-48ed-9971-5c691355194c-tls-certs\") on node \"ip-10-0-139-10.ec2.internal\" DevicePath \"\"" Apr 22 19:09:28.546784 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.546750 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-gq2nj_a2ced0b6-019b-444f-b30c-8c2a55bbe5de/discovery/0.log" Apr 22 19:09:28.561759 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.561709 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-z9w6s_00a61ee1-c541-4c2e-9b18-cca0dafd449d/istio-proxy/0.log" Apr 22 19:09:28.579021 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.579000 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-765df457d-pff85_4db562fa-e132-42e2-8767-a511fe5551aa/router/0.log" Apr 22 19:09:28.610059 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.610038 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-54cbd5b894-xl9kh_f8644bad-55fe-48ed-9971-5c691355194c/main/0.log" Apr 22 19:09:28.610583 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.610564 2577 generic.go:358] "Generic (PLEG): container finished" podID="f8644bad-55fe-48ed-9971-5c691355194c" containerID="3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb" exitCode=137 Apr 22 19:09:28.610583 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.610581 2577 generic.go:358] "Generic (PLEG): container 
finished" podID="f8644bad-55fe-48ed-9971-5c691355194c" containerID="26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527" exitCode=0 Apr 22 19:09:28.610739 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.610630 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" Apr 22 19:09:28.610739 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.610646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" event={"ID":"f8644bad-55fe-48ed-9971-5c691355194c","Type":"ContainerDied","Data":"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb"} Apr 22 19:09:28.610739 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.610675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" event={"ID":"f8644bad-55fe-48ed-9971-5c691355194c","Type":"ContainerDied","Data":"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527"} Apr 22 19:09:28.610739 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.610685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh" event={"ID":"f8644bad-55fe-48ed-9971-5c691355194c","Type":"ContainerDied","Data":"3c58b924a355e0f57455f4a8f6b7188d6bbe6729b699377c420ee1ef2c45f689"} Apr 22 19:09:28.610739 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.610701 2577 scope.go:117] "RemoveContainer" containerID="3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb" Apr 22 19:09:28.612294 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.612270 2577 generic.go:358] "Generic (PLEG): container finished" podID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerID="5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9" exitCode=137 Apr 22 19:09:28.612391 ip-10-0-139-10 kubenswrapper[2577]: I0422 
19:09:28.612307 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" event={"ID":"7f04d7ec-bd13-4bf8-9389-460dca58e4e0","Type":"ContainerDied","Data":"5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9"} Apr 22 19:09:28.612391 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.612328 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" Apr 22 19:09:28.612391 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.612335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n" event={"ID":"7f04d7ec-bd13-4bf8-9389-460dca58e4e0","Type":"ContainerDied","Data":"be5faaccfe869051984852ffccde21fd5229307714f7e12dbf9a248ac6ba4e6d"} Apr 22 19:09:28.635685 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.635665 2577 scope.go:117] "RemoveContainer" containerID="0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e" Apr 22 19:09:28.644755 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.644733 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"] Apr 22 19:09:28.648660 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.648638 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-54cbd5b894-xl9kh"] Apr 22 19:09:28.660076 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.660054 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"] Apr 22 19:09:28.665310 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.665287 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-77cd5c7dc9-wqn7n"] Apr 22 19:09:28.701266 
ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.701134 2577 scope.go:117] "RemoveContainer" containerID="26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527" Apr 22 19:09:28.709177 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.709160 2577 scope.go:117] "RemoveContainer" containerID="3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb" Apr 22 19:09:28.709427 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:09:28.709402 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb\": container with ID starting with 3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb not found: ID does not exist" containerID="3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb" Apr 22 19:09:28.709480 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.709439 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb"} err="failed to get container status \"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb\": rpc error: code = NotFound desc = could not find container \"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb\": container with ID starting with 3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb not found: ID does not exist" Apr 22 19:09:28.709480 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.709460 2577 scope.go:117] "RemoveContainer" containerID="0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e" Apr 22 19:09:28.709710 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:09:28.709691 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e\": container with ID starting with 
0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e not found: ID does not exist" containerID="0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e" Apr 22 19:09:28.709824 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.709804 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e"} err="failed to get container status \"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e\": rpc error: code = NotFound desc = could not find container \"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e\": container with ID starting with 0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e not found: ID does not exist" Apr 22 19:09:28.709868 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.709826 2577 scope.go:117] "RemoveContainer" containerID="26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527" Apr 22 19:09:28.710033 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:09:28.710018 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527\": container with ID starting with 26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527 not found: ID does not exist" containerID="26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527" Apr 22 19:09:28.710075 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.710037 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527"} err="failed to get container status \"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527\": rpc error: code = NotFound desc = could not find container \"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527\": container with ID starting with 
26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527 not found: ID does not exist" Apr 22 19:09:28.710075 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.710052 2577 scope.go:117] "RemoveContainer" containerID="3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb" Apr 22 19:09:28.710259 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.710242 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb"} err="failed to get container status \"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb\": rpc error: code = NotFound desc = could not find container \"3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb\": container with ID starting with 3c7ab8fcb70ffd8e9943c86597e871de9c7a121db67614f36256555923830ebb not found: ID does not exist" Apr 22 19:09:28.710307 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.710260 2577 scope.go:117] "RemoveContainer" containerID="0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e" Apr 22 19:09:28.710463 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.710445 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e"} err="failed to get container status \"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e\": rpc error: code = NotFound desc = could not find container \"0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e\": container with ID starting with 0d4ea42cbd2303069b200104789c9c5583c466e1b5550d695e779c3529830b9e not found: ID does not exist" Apr 22 19:09:28.710517 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.710462 2577 scope.go:117] "RemoveContainer" containerID="26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527" Apr 22 19:09:28.710656 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.710639 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527"} err="failed to get container status \"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527\": rpc error: code = NotFound desc = could not find container \"26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527\": container with ID starting with 26f305e0cef1512c390570b83c2b75fed377004fbc640502dfc027ae1cea5527 not found: ID does not exist" Apr 22 19:09:28.710698 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.710656 2577 scope.go:117] "RemoveContainer" containerID="5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9" Apr 22 19:09:28.730046 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.730032 2577 scope.go:117] "RemoveContainer" containerID="dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3" Apr 22 19:09:28.760827 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.760810 2577 scope.go:117] "RemoveContainer" containerID="5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9" Apr 22 19:09:28.761095 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:09:28.761075 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9\": container with ID starting with 5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9 not found: ID does not exist" containerID="5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9" Apr 22 19:09:28.761158 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.761104 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9"} err="failed to get container status \"5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9\": rpc error: code = NotFound desc = could not 
find container \"5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9\": container with ID starting with 5cd64e28e6f0d6fe20bc17e413cf317184107dcf691422cb0c94e80acff058d9 not found: ID does not exist" Apr 22 19:09:28.761158 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.761120 2577 scope.go:117] "RemoveContainer" containerID="dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3" Apr 22 19:09:28.761360 ip-10-0-139-10 kubenswrapper[2577]: E0422 19:09:28.761334 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3\": container with ID starting with dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3 not found: ID does not exist" containerID="dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3" Apr 22 19:09:28.761403 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.761365 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3"} err="failed to get container status \"dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3\": rpc error: code = NotFound desc = could not find container \"dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3\": container with ID starting with dd840a03c4150ece03a96ae0a245fa34b4e6c193dae9726b43411d8951d116b3 not found: ID does not exist" Apr 22 19:09:28.884825 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.884746 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" path="/var/lib/kubelet/pods/7f04d7ec-bd13-4bf8-9389-460dca58e4e0/volumes" Apr 22 19:09:28.885260 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:28.885239 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8644bad-55fe-48ed-9971-5c691355194c" 
path="/var/lib/kubelet/pods/f8644bad-55fe-48ed-9971-5c691355194c/volumes" Apr 22 19:09:29.397458 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:29.397424 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-gq2nj_a2ced0b6-019b-444f-b30c-8c2a55bbe5de/discovery/0.log" Apr 22 19:09:29.416094 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:29.416067 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-z9w6s_00a61ee1-c541-4c2e-9b18-cca0dafd449d/istio-proxy/0.log" Apr 22 19:09:29.434401 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:29.434378 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-765df457d-pff85_4db562fa-e132-42e2-8767-a511fe5551aa/router/0.log" Apr 22 19:09:30.222766 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:30.222732 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-bfb4t_f12f3369-dec0-4542-bbaa-83a886a9fb9f/manager/0.log" Apr 22 19:09:30.337589 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:30.337561 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-nfsvg_5e57758f-365e-4543-882d-4a85198dc967/manager/0.log" Apr 22 19:09:36.141223 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:36.141188 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fgvjk_7d025095-cc31-4ade-becf-5c56f458a510/global-pull-secret-syncer/0.log" Apr 22 19:09:36.239517 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:36.239484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jpzr8_27440aa5-4698-4ea7-b7a8-ca0f7994d4e8/konnectivity-agent/0.log" Apr 22 19:09:36.336157 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:36.336128 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-10.ec2.internal_baab329e26e3fec548046283f03a6805/haproxy/0.log" Apr 22 19:09:40.428497 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:40.428463 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-bfb4t_f12f3369-dec0-4542-bbaa-83a886a9fb9f/manager/0.log" Apr 22 19:09:40.589491 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:40.589462 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-nfsvg_5e57758f-365e-4543-882d-4a85198dc967/manager/0.log" Apr 22 19:09:41.571697 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.571668 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_76660775-fc5f-4d9a-8159-93b4eb7a52d3/alertmanager/0.log" Apr 22 19:09:41.599603 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.599575 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_76660775-fc5f-4d9a-8159-93b4eb7a52d3/config-reloader/0.log" Apr 22 19:09:41.621649 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.621627 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_76660775-fc5f-4d9a-8159-93b4eb7a52d3/kube-rbac-proxy-web/0.log" Apr 22 19:09:41.642842 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.642817 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_76660775-fc5f-4d9a-8159-93b4eb7a52d3/kube-rbac-proxy/0.log" Apr 22 19:09:41.664115 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.664092 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_76660775-fc5f-4d9a-8159-93b4eb7a52d3/kube-rbac-proxy-metric/0.log" Apr 22 19:09:41.686870 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.686847 2577 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_76660775-fc5f-4d9a-8159-93b4eb7a52d3/prom-label-proxy/0.log" Apr 22 19:09:41.710930 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.710905 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_76660775-fc5f-4d9a-8159-93b4eb7a52d3/init-config-reloader/0.log" Apr 22 19:09:41.775197 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.775173 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qqsrj_3c68dd5e-6d55-44da-b104-a82c798b9b6f/kube-state-metrics/0.log" Apr 22 19:09:41.796653 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.796631 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qqsrj_3c68dd5e-6d55-44da-b104-a82c798b9b6f/kube-rbac-proxy-main/0.log" Apr 22 19:09:41.818177 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.818158 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qqsrj_3c68dd5e-6d55-44da-b104-a82c798b9b6f/kube-rbac-proxy-self/0.log" Apr 22 19:09:41.848790 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.848733 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6c6cfdd7fb-nnqg6_8f90f953-62dc-48c5-ac04-3780fa1d00ba/metrics-server/0.log" Apr 22 19:09:41.993731 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:41.993697 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjrlm_07af08e5-baae-4347-8c0e-109a222d35de/node-exporter/0.log" Apr 22 19:09:42.014066 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:42.014042 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjrlm_07af08e5-baae-4347-8c0e-109a222d35de/kube-rbac-proxy/0.log" Apr 22 19:09:42.053185 ip-10-0-139-10 kubenswrapper[2577]: I0422 
19:09:42.053161 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjrlm_07af08e5-baae-4347-8c0e-109a222d35de/init-textfile/0.log" Apr 22 19:09:42.160211 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:42.160187 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-tr5s8_8d37dbf7-4750-4106-9017-8187fc45ab69/kube-rbac-proxy-main/0.log" Apr 22 19:09:42.181790 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:42.181766 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-tr5s8_8d37dbf7-4750-4106-9017-8187fc45ab69/kube-rbac-proxy-self/0.log" Apr 22 19:09:42.203844 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:42.203822 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-tr5s8_8d37dbf7-4750-4106-9017-8187fc45ab69/openshift-state-metrics/0.log" Apr 22 19:09:42.560049 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:42.560011 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-767895b6fd-9qkcg_7e29b4ae-9071-47ff-8090-f8ab3c12bd28/telemeter-client/0.log" Apr 22 19:09:42.581792 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:42.581761 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-767895b6fd-9qkcg_7e29b4ae-9071-47ff-8090-f8ab3c12bd28/reload/0.log" Apr 22 19:09:42.608007 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:42.607986 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-767895b6fd-9qkcg_7e29b4ae-9071-47ff-8090-f8ab3c12bd28/kube-rbac-proxy/0.log" Apr 22 19:09:44.024422 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:44.024394 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fvpf2_7d5bda99-21d1-4ac5-ab04-13b17c683ad1/networking-console-plugin/0.log" Apr 22 19:09:45.149607 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.149581 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75cfdbbd69-4rwhn_16e4bc48-b665-438a-944b-bed7491377b7/console/0.log" Apr 22 19:09:45.187741 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.187698 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-l4mxm_82fb16f8-4657-404e-b89a-3a3762417871/download-server/0.log" Apr 22 19:09:45.368897 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.368816 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr"] Apr 22 19:09:45.369240 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369213 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="llm-d-routing-sidecar" Apr 22 19:09:45.369240 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369234 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="llm-d-routing-sidecar" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369249 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369255 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369266 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" Apr 22 19:09:45.369387 ip-10-0-139-10 
kubenswrapper[2577]: I0422 19:09:45.369271 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369281 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="storage-initializer" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369286 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="storage-initializer" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369292 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="storage-initializer" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369297 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="storage-initializer" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369304 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aa38dbc-26cc-4873-80cb-8063021d80d9" containerName="storage-initializer" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369310 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa38dbc-26cc-4873-80cb-8063021d80d9" containerName="storage-initializer" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369315 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aa38dbc-26cc-4873-80cb-8063021d80d9" containerName="main" Apr 22 19:09:45.369387 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369320 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa38dbc-26cc-4873-80cb-8063021d80d9" containerName="main" Apr 22 19:09:45.369387 ip-10-0-139-10 
kubenswrapper[2577]: I0422 19:09:45.369384 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="main" Apr 22 19:09:45.369812 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369394 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8644bad-55fe-48ed-9971-5c691355194c" containerName="llm-d-routing-sidecar" Apr 22 19:09:45.369812 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369401 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f04d7ec-bd13-4bf8-9389-460dca58e4e0" containerName="main" Apr 22 19:09:45.369812 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.369408 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aa38dbc-26cc-4873-80cb-8063021d80d9" containerName="main" Apr 22 19:09:45.373982 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.373967 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.376290 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.376268 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nmkll\"/\"default-dockercfg-vxhkm\"" Apr 22 19:09:45.376600 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.376581 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nmkll\"/\"kube-root-ca.crt\"" Apr 22 19:09:45.376681 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.376660 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nmkll\"/\"openshift-service-ca.crt\"" Apr 22 19:09:45.381732 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.381687 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr"] Apr 22 19:09:45.470014 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.469983 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-podres\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.470014 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.470021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-proc\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.470211 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.470044 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xq57\" (UniqueName: \"kubernetes.io/projected/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-kube-api-access-7xq57\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.470211 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.470074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-lib-modules\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.470211 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.470127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-sys\") pod 
\"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571197 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-sys\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571367 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-podres\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571367 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-proc\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571367 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xq57\" (UniqueName: \"kubernetes.io/projected/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-kube-api-access-7xq57\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571367 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-lib-modules\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571367 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-sys\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571568 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-proc\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571568 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-lib-modules\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.571568 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.571441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-podres\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.579517 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.579490 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xq57\" (UniqueName: \"kubernetes.io/projected/a1d7463f-965c-4957-bd0f-fbefae0bcf4d-kube-api-access-7xq57\") pod \"perf-node-gather-daemonset-wnhcr\" (UID: \"a1d7463f-965c-4957-bd0f-fbefae0bcf4d\") " pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:45.691938 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:45.691911 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:46.022926 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:46.022898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr"] Apr 22 19:09:46.024510 ip-10-0-139-10 kubenswrapper[2577]: W0422 19:09:46.024487 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda1d7463f_965c_4957_bd0f_fbefae0bcf4d.slice/crio-49e971313726609ac7afdf96b863d67eb77abe848bb5a18e6e6b428f78c7f114 WatchSource:0}: Error finding container 49e971313726609ac7afdf96b863d67eb77abe848bb5a18e6e6b428f78c7f114: Status 404 returned error can't find the container with id 49e971313726609ac7afdf96b863d67eb77abe848bb5a18e6e6b428f78c7f114 Apr 22 19:09:46.026302 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:46.026286 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:09:46.469640 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:46.469612 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-khmjv_7bae8419-18dc-4dd7-a71d-bacb197e4c26/dns/0.log" Apr 22 19:09:46.490951 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:46.490931 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-khmjv_7bae8419-18dc-4dd7-a71d-bacb197e4c26/kube-rbac-proxy/0.log" Apr 22 19:09:46.566918 ip-10-0-139-10 
kubenswrapper[2577]: I0422 19:09:46.566891 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-642nb_f72f07da-c956-44a1-91e4-efb83a4ae9fc/dns-node-resolver/0.log" Apr 22 19:09:46.686494 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:46.686462 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" event={"ID":"a1d7463f-965c-4957-bd0f-fbefae0bcf4d","Type":"ContainerStarted","Data":"ada550b1db62ac5bd7fb32bba73a015919993bb0de9d8bab3d240b8e7cb5d724"} Apr 22 19:09:46.686628 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:46.686500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" event={"ID":"a1d7463f-965c-4957-bd0f-fbefae0bcf4d","Type":"ContainerStarted","Data":"49e971313726609ac7afdf96b863d67eb77abe848bb5a18e6e6b428f78c7f114"} Apr 22 19:09:46.686628 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:46.686576 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:46.703651 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:46.703608 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" podStartSLOduration=1.703595499 podStartE2EDuration="1.703595499s" podCreationTimestamp="2026-04-22 19:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:09:46.701201925 +0000 UTC m=+1922.339727050" watchObservedRunningTime="2026-04-22 19:09:46.703595499 +0000 UTC m=+1922.342120621" Apr 22 19:09:47.073576 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:47.073547 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-7fd45d456b-kdh4x_7f399e35-f80e-49ff-9647-c3b6fbfa8e04/registry/0.log" Apr 22 19:09:47.094976 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:47.094948 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-648xg_4f6716c1-8454-4bb0-a15d-144eeaa62e20/node-ca/0.log" Apr 22 19:09:48.002352 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:48.002321 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-gq2nj_a2ced0b6-019b-444f-b30c-8c2a55bbe5de/discovery/0.log" Apr 22 19:09:48.026382 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:48.026361 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-z9w6s_00a61ee1-c541-4c2e-9b18-cca0dafd449d/istio-proxy/0.log" Apr 22 19:09:48.050040 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:48.050014 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-765df457d-pff85_4db562fa-e132-42e2-8767-a511fe5551aa/router/0.log" Apr 22 19:09:48.511788 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:48.511761 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-psh89_41ba337b-6571-4648-95a1-73c5f1faa37f/serve-healthcheck-canary/0.log" Apr 22 19:09:49.031381 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:49.031347 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gsnxt_ffeaef2c-e524-4aff-b6b1-3a7e61159f09/insights-operator/1.log" Apr 22 19:09:49.031956 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:49.031851 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gsnxt_ffeaef2c-e524-4aff-b6b1-3a7e61159f09/insights-operator/0.log" Apr 22 19:09:49.053999 ip-10-0-139-10 kubenswrapper[2577]: I0422 
19:09:49.053971 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6dq6k_f1033c20-6786-4aff-89a9-a9bf1b0c3ed8/kube-rbac-proxy/0.log" Apr 22 19:09:49.075180 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:49.075160 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6dq6k_f1033c20-6786-4aff-89a9-a9bf1b0c3ed8/exporter/0.log" Apr 22 19:09:49.096325 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:49.096301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6dq6k_f1033c20-6786-4aff-89a9-a9bf1b0c3ed8/extractor/0.log" Apr 22 19:09:52.379220 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:52.379190 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-d9c56dd68-5sq2t_ce40ac3a-16fb-4e00-8be5-c55e202bd2d4/manager/0.log" Apr 22 19:09:52.703109 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:52.703081 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nmkll/perf-node-gather-daemonset-wnhcr" Apr 22 19:09:52.798724 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:52.798681 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-hn5dl_6cc591d9-7303-4604-8c02-f07fa71d9e40/manager/0.log" Apr 22 19:09:52.848541 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:52.848507 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-l9gzq_cee16190-fc1d-4bc1-a275-a6c0fe5e8563/seaweedfs/0.log" Apr 22 19:09:59.424223 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.424188 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fcffm_b4af3b82-6cae-4792-8fe7-cf2daed473d1/kube-multus-additional-cni-plugins/0.log" Apr 22 19:09:59.447473 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.447441 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fcffm_b4af3b82-6cae-4792-8fe7-cf2daed473d1/egress-router-binary-copy/0.log" Apr 22 19:09:59.469142 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.469117 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fcffm_b4af3b82-6cae-4792-8fe7-cf2daed473d1/cni-plugins/0.log" Apr 22 19:09:59.491752 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.491733 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fcffm_b4af3b82-6cae-4792-8fe7-cf2daed473d1/bond-cni-plugin/0.log" Apr 22 19:09:59.513427 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.513406 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fcffm_b4af3b82-6cae-4792-8fe7-cf2daed473d1/routeoverride-cni/0.log" Apr 22 19:09:59.535301 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.535246 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fcffm_b4af3b82-6cae-4792-8fe7-cf2daed473d1/whereabouts-cni-bincopy/0.log" Apr 22 19:09:59.559699 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.559676 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fcffm_b4af3b82-6cae-4792-8fe7-cf2daed473d1/whereabouts-cni/0.log" Apr 22 19:09:59.662123 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.662099 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pr4sw_556c4304-c27e-49e6-9289-7b8986ec176b/kube-multus/0.log" Apr 22 19:09:59.774273 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.774205 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cqf5t_e3bffc40-492a-471a-83d2-c9bd203d82a8/network-metrics-daemon/0.log" Apr 22 19:09:59.792007 
ip-10-0-139-10 kubenswrapper[2577]: I0422 19:09:59.791981 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cqf5t_e3bffc40-492a-471a-83d2-c9bd203d82a8/kube-rbac-proxy/0.log" Apr 22 19:10:00.593388 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:10:00.593357 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bm5qp_bfc4f516-12a3-4b4c-948f-d21348678585/ovn-controller/0.log" Apr 22 19:10:00.628066 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:10:00.628040 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bm5qp_bfc4f516-12a3-4b4c-948f-d21348678585/ovn-acl-logging/0.log" Apr 22 19:10:00.648763 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:10:00.648738 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bm5qp_bfc4f516-12a3-4b4c-948f-d21348678585/kube-rbac-proxy-node/0.log" Apr 22 19:10:00.670578 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:10:00.670559 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bm5qp_bfc4f516-12a3-4b4c-948f-d21348678585/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:10:00.691267 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:10:00.691245 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bm5qp_bfc4f516-12a3-4b4c-948f-d21348678585/northd/0.log" Apr 22 19:10:00.713276 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:10:00.713254 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bm5qp_bfc4f516-12a3-4b4c-948f-d21348678585/nbdb/0.log" Apr 22 19:10:00.737474 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:10:00.737453 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bm5qp_bfc4f516-12a3-4b4c-948f-d21348678585/sbdb/0.log" Apr 22 19:10:00.911359 ip-10-0-139-10 
kubenswrapper[2577]: I0422 19:10:00.911330 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bm5qp_bfc4f516-12a3-4b4c-948f-d21348678585/ovnkube-controller/0.log" Apr 22 19:10:02.684250 ip-10-0-139-10 kubenswrapper[2577]: I0422 19:10:02.684223 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jkhtl_c50ef4af-f64b-4608-b9e7-126d66048d98/network-check-target-container/0.log"