Apr 23 17:41:31.954013 ip-10-0-138-68 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:41:32.393746 ip-10-0-138-68 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:32.393746 ip-10-0-138-68 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:41:32.393746 ip-10-0-138-68 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:32.393746 ip-10-0-138-68 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:41:32.393746 ip-10-0-138-68 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:41:32.394961 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.394872 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:41:32.399901 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399877 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:32.399901 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399896 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:32.399901 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399901 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:32.399901 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399904 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:32.399901 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399907 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399911 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399914 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399917 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399920 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399925 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399929 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399932 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399935 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399939 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399942 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399944 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399947 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399950 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399952 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399955 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399957 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399967 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399970 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:32.400103 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399973 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399975 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399978 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399980 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399983 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399985 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399988 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399991 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399994 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.399997 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400000 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400003 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400005 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400008 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400010 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400013 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400016 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400019 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400021 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400024 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:32.400638 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400027 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400029 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400032 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400035 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400038 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400040 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400043 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400045 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400048 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400051 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400053 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400055 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400058 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400061 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400063 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400066 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400068 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400070 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400073 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:32.401194 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400076 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400078 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400080 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400083 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400086 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400088 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400090 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400093 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400096 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400099 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400103 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400106 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400109 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400111 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400116 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400119 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400125 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400128 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400131 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400134 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:32.401657 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400137 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400140 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400142 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400145 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400557 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400563 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400566 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400569 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400573 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400575 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400578 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400582 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400585 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400587 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400590 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400593 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400595 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400598 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400600 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400603 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:41:32.402151 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400606 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400609 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400612 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400615 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400618 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400620 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400623 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400626 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400629 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400632 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400634 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400637 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400639 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400642 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400645 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400647 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400650 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400653 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400656 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:41:32.402618 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400658 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400661 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400664 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400666 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400669 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400672 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400675 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400678 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400680 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400683 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400685 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400688 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400690 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400693 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400696 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400699 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400701 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400704 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400707 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:41:32.403115 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400711 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400729 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400732 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400736 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400738 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400741 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400744 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400746 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400749 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400752 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400755 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400758 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400760 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400763 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400765 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400768 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400770 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400773 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400775 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400778 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:41:32.403611 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400781 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400784 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400786 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400789 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400791 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400794 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400797 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400800 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400804 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400807 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400810 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.400813 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402371 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402380 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402389 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402394 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402399 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402403 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402407 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402412 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:41:32.404110 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402415 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402418 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402422 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402425 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402428 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402431 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402434 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402437 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402440 2576 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402443 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402446 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402450 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402453 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402456 2576 flags.go:64] FLAG: --config-dir=""
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402459 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402463 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402467 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402470 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402474 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402477 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402480 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402483 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402486 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402490 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402493 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:41:32.404612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402497 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402500 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402504 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402506 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402510 2576 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402513 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402519 2576 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402522 2576 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402525 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402528 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402531 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402535 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402538 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402540 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402543 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402546 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402549 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402552 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402555 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402559 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402562 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423
17:41:32.402564 2576 flags.go:64] FLAG: --feature-gates="" Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402569 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402572 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402574 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:41:32.405237 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402578 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402581 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402584 2576 flags.go:64] FLAG: --help="false" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402587 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-138-68.ec2.internal" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402591 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402593 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402596 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402600 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402603 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402606 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:41:32.405856 
ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402609 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402612 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402615 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402618 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402621 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402624 2576 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402627 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402630 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402634 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402636 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402639 2576 flags.go:64] FLAG: --lock-file="" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402642 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402645 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402650 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:41:32.405856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402656 2576 flags.go:64] FLAG: 
--log-json-split-stream="false" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402659 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402661 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402664 2576 flags.go:64] FLAG: --logging-format="text" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402667 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402670 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402674 2576 flags.go:64] FLAG: --manifest-url="" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402676 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402681 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402685 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402689 2576 flags.go:64] FLAG: --max-pods="110" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402692 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402695 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402698 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402701 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:41:32.406436 ip-10-0-138-68 
kubenswrapper[2576]: I0423 17:41:32.402704 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402707 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402710 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402734 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402738 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402742 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402746 2576 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402749 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:41:32.406436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402754 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402757 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402761 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402763 2576 flags.go:64] FLAG: --port="10250" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402766 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402769 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b13dd43e58a7cf92" Apr 23 
17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402772 2576 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402776 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402780 2576 flags.go:64] FLAG: --register-node="true" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402783 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402786 2576 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402789 2576 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402792 2576 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402795 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402798 2576 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402802 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402805 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402808 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402812 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402815 2576 flags.go:64] FLAG: --runonce="false" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402818 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:41:32.407000 
ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402821 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402824 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402827 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402830 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402833 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 17:41:32.407000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402836 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402840 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402843 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402846 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402849 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402852 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402855 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402858 2576 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402865 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 
17:41:32.402871 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402873 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402876 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402880 2576 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402883 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402887 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402891 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402894 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402897 2576 flags.go:64] FLAG: --v="2" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402901 2576 flags.go:64] FLAG: --version="false" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402905 2576 flags.go:64] FLAG: --vmodule="" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402910 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.402913 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403003 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:41:32.407616 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403007 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:41:32.407616 
ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403012 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403015 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403017 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403020 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403023 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403025 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403028 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403031 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403035 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403038 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403042 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403045 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403049 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403059 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403064 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403067 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403070 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403073 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403076 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:41:32.408231 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403079 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403081 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403084 2576 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403088 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403092 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403094 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403097 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403100 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403103 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403106 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403108 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403111 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403114 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403118 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403121 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 
17:41:32.403123 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403126 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403129 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403131 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403134 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:41:32.408763 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403136 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403139 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403142 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403144 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403147 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403149 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403152 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403155 2576 feature_gate.go:328] unrecognized feature gate: 
AzureWorkloadIdentity Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403158 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403161 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403163 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403166 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403168 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403171 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403173 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403177 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403179 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403181 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403184 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403187 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:41:32.409259 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403189 2576 feature_gate.go:328] 
unrecognized feature gate: MultiDiskSetup Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403191 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403194 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403196 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403199 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403202 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403205 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403207 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403210 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403212 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403215 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403217 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403219 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:41:32.409785 ip-10-0-138-68 
kubenswrapper[2576]: W0423 17:41:32.403222 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403224 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403227 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403230 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403232 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403235 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:41:32.409785 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403237 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:41:32.410249 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403240 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:41:32.410249 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403243 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:41:32.410249 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403246 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:41:32.410249 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403248 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:41:32.410249 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.403251 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:41:32.410249 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.404030 2576 feature_gate.go:384] 
feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:41:32.410620 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.410599 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 17:41:32.410655 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.410621 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 17:41:32.410686 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410667 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:41:32.410686 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410674 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:41:32.410686 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410679 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:41:32.410686 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410683 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:41:32.410686 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410686 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410689 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410692 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410696 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410698 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410701 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410704 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410707 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410710 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410712 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410727 2576 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410730 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410733 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410735 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410738 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410742 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410745 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410748 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410751 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410755 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:41:32.410825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410759 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410761 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410764 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410767 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410770 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410772 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410775 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410778 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410781 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410784 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410787 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410790 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410792 2576 feature_gate.go:328] 
unrecognized feature gate: GatewayAPIController Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410795 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410798 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410801 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410804 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410807 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410809 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410812 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:41:32.411321 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410815 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410818 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410820 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410823 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410826 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: 
W0423 17:41:32.410829 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410831 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410834 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410837 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410840 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410843 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410845 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410848 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410850 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410853 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410855 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410858 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410861 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:41:32.411831 ip-10-0-138-68 
kubenswrapper[2576]: W0423 17:41:32.410863 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410865 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:41:32.411831 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410872 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410874 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410877 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410880 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410883 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410885 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410889 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410892 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410894 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410897 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410899 2576 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410902 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410905 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410907 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410910 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410912 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410915 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410917 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410920 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:41:32.412319 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410923 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410925 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.410928 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.410934 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true 
KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411038 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411043 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411046 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411049 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411060 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411064 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411066 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411069 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411072 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411074 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:41:32.412874 
ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411081 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411084 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:41:32.412874 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411086 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411089 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411092 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411095 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411098 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411101 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411103 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411106 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411109 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411111 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411114 2576 feature_gate.go:328] unrecognized feature 
gate: HighlyAvailableArbiter Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411117 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411119 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411122 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411124 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411127 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411129 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411132 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411134 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411137 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:41:32.413287 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411139 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411142 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411144 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 
17:41:32.411147 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411149 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411152 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411154 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411157 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411159 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411162 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411164 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411172 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411175 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411177 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411180 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411182 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:41:32.413772 ip-10-0-138-68 
kubenswrapper[2576]: W0423 17:41:32.411185 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411188 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411190 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411193 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:41:32.413772 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411195 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411197 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411201 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411205 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411208 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411210 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411212 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411215 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411217 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411220 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411222 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411225 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411227 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411230 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411232 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411235 2576 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411237 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411240 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411242 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411245 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:41:32.414279 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411247 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411249 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411252 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411254 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411257 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411260 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411262 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411265 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:41:32.414783 ip-10-0-138-68 
kubenswrapper[2576]: W0423 17:41:32.411268 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411270 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411274 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411278 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411280 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:32.411283 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.411288 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:41:32.414783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.412003 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 17:41:32.415178 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.414124 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 17:41:32.415178 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.415105 2576 
server.go:1019] "Starting client certificate rotation" Apr 23 17:41:32.415228 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.415207 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:41:32.415258 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.415245 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:41:32.441359 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.441338 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:41:32.444810 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.444792 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:41:32.459554 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.459534 2576 log.go:25] "Validated CRI v1 runtime API" Apr 23 17:41:32.465662 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.465636 2576 log.go:25] "Validated CRI v1 image API" Apr 23 17:41:32.468086 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.468065 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 17:41:32.470923 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.470905 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 17:41:32.473126 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.473107 2576 fs.go:135] Filesystem UUIDs: map[54a81b47-7ef1-4a56-aa81-bd482a1ffcbd:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a0d3f786-366c-43de-a1f2-6ff6145f0e8c:/dev/nvme0n1p4] Apr 23 17:41:32.473196 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.473127 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 
fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 17:41:32.479051 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.478941 2576 manager.go:217] Machine: {Timestamp:2026-04-23 17:41:32.476932656 +0000 UTC m=+0.406945283 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099745 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20b0cd3b8a3af38880f58909f4b0e4 SystemUUID:ec20b0cd-3b8a-3af3-8880-f58909f4b0e4 BootID:864bb56e-4aa5-48d0-b963-52a91918fbd8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3c:4c:c6:78:19 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3c:4c:c6:78:19 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:92:de:59:47:e6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 17:41:32.479051 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.479045 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 17:41:32.479165 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.479137 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 17:41:32.480268 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.480246 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 17:41:32.480412 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.480270 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-68.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 17:41:32.480453 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.480421 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 17:41:32.480453 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.480432 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 17:41:32.480453 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.480445 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:41:32.481269 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.481254 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:41:32.482163 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.482153 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:41:32.482269 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.482261 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 17:41:32.484838 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.484828 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 17:41:32.484873 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.484843 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 17:41:32.484873 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.484855 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 17:41:32.484873 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.484864 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 23 17:41:32.484873 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.484873 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 17:41:32.485968 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.485956 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:41:32.486008 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.485975 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:41:32.489072 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.489056 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 17:41:32.490416 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.490403 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 17:41:32.492116 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492095 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 17:41:32.492116 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492113 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492120 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492128 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492137 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492147 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492155 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492163 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492170 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492177 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492185 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 17:41:32.492202 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.492194 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 17:41:32.493015 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.493005 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 17:41:32.493056 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.493017 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 17:41:32.496742 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.496708 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 17:41:32.496845 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.496763 2576 server.go:1295] "Started kubelet"
Apr 23 17:41:32.496955 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.496933 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-68.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:41:32.497056 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.497033 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-68.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:41:32.497093 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.497028 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:41:32.497123 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.497081 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 17:41:32.497152 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.497110 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 17:41:32.497199 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.497124 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 17:41:32.497735 ip-10-0-138-68 systemd[1]: Started Kubernetes Kubelet.
Apr 23 17:41:32.498592 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.498476 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 17:41:32.501035 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.501012 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 17:41:32.503908 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.502918 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-68.ec2.internal.18a90d3f112e99b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-68.ec2.internal,UID:ip-10-0-138-68.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-68.ec2.internal,},FirstTimestamp:2026-04-23 17:41:32.496738742 +0000 UTC m=+0.426751369,LastTimestamp:2026-04-23 17:41:32.496738742 +0000 UTC m=+0.426751369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-68.ec2.internal,}"
Apr 23 17:41:32.505287 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.505265 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 17:41:32.506499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.505984 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 17:41:32.507050 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.506698 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 17:41:32.507867 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.507849 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 17:41:32.507973 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.507955 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 17:41:32.508068 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.507984 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 17:41:32.508109 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508097 2576 factory.go:55] Registering systemd factory
Apr 23 17:41:32.508109 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508107 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 17:41:32.508170 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508114 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 17:41:32.508170 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508114 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 23 17:41:32.508337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508326 2576 factory.go:153] Registering CRI-O factory
Apr 23 17:41:32.508400 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508339 2576 factory.go:223] Registration of the crio container factory successfully
Apr 23 17:41:32.508400 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508380 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 17:41:32.508481 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508405 2576 factory.go:103] Registering Raw factory
Apr 23 17:41:32.508481 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508422 2576 manager.go:1196] Started watching for new ooms in manager
Apr 23 17:41:32.508612 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.508588 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:32.508858 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.508842 2576 manager.go:319] Starting recovery of all containers
Apr 23 17:41:32.509857 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.509828 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 17:41:32.509963 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.509928 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-68.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 17:41:32.512522 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.512344 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qqs58"
Apr 23 17:41:32.517978 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.517954 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qqs58"
Apr 23 17:41:32.518908 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.518895 2576 manager.go:324] Recovery completed
Apr 23 17:41:32.522762 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.522749 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:32.530419 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.530402 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:32.530494 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.530431 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:32.530494 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.530442 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:32.531020 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.531000 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 17:41:32.531020 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.531015 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 17:41:32.531136 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.531031 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:41:32.534426 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.534412 2576 policy_none.go:49] "None policy: Start"
Apr 23 17:41:32.534489 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.534435 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 17:41:32.534489 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.534445 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 17:41:32.576380 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.576361 2576 manager.go:341] "Starting Device Plugin manager"
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.576415 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.576430 2576 server.go:85] "Starting device plugin registration server"
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.576758 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.576773 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.576867 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.576953 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.576961 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.577411 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 17:41:32.589783 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.577454 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:32.641178 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.641134 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 17:41:32.642522 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.642498 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 17:41:32.642522 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.642527 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 17:41:32.642648 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.642552 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 17:41:32.642648 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.642560 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 17:41:32.642648 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.642595 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 17:41:32.644626 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.644571 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:41:32.677352 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.677328 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:32.678372 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.678356 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:32.678468 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.678386 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:32.678468 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.678397 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:32.678468 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.678419 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.686014 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.685996 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.686107 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.686021 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-68.ec2.internal\": node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:32.697057 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.697036 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:32.742962 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.742934 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"]
Apr 23 17:41:32.743074 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.743008 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:32.744660 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.744645 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:32.744757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.744674 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:32.744757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.744686 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:32.747091 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747077 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:32.747234 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.747284 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747242 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:32.747823 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747799 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:32.747906 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747832 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:32.747906 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747844 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:32.747906 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747894 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:32.748001 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747919 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:32.748001 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.747937 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:32.750650 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.750635 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.750712 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.750663 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:41:32.751370 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.751349 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:41:32.751464 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.751379 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:41:32.751464 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.751390 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:41:32.775629 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.775599 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-68.ec2.internal\" not found" node="ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.780144 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.780125 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-68.ec2.internal\" not found" node="ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.797562 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.797537 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:32.810396 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.810369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.810513 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.810398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.810513 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.810418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9475ce23d467a37e0480df7597bbc574-config\") pod \"kube-apiserver-proxy-ip-10-0-138-68.ec2.internal\" (UID: \"9475ce23d467a37e0480df7597bbc574\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.898453 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.898363 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:32.910757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.910713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.910879 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.910773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9475ce23d467a37e0480df7597bbc574-config\") pod \"kube-apiserver-proxy-ip-10-0-138-68.ec2.internal\" (UID: \"9475ce23d467a37e0480df7597bbc574\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.910879 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.910818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.910879 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.910825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.910879 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.910863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25dc2f6e99d2525192843ee005a28c4f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal\" (UID: \"25dc2f6e99d2525192843ee005a28c4f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.911019 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:32.910881 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9475ce23d467a37e0480df7597bbc574-config\") pod \"kube-apiserver-proxy-ip-10-0-138-68.ec2.internal\" (UID: \"9475ce23d467a37e0480df7597bbc574\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:32.999157 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:32.999112 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:33.077649 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.077612 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:33.083305 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.083287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal"
Apr 23 17:41:33.100103 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.100082 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:33.200663 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.200580 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:33.301147 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.301110 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:33.401654 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.401622 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:33.415109 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.415084 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:41:33.415235 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.415219 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:41:33.501852 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.501708 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:33.505615 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.505598 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:41:33.515933 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.515899 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:41:33.520033 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.520006 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:36:32 +0000 UTC" deadline="2027-11-15 17:53:02.085170617 +0000 UTC"
Apr 23 17:41:33.520033 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.520032 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13704h11m28.565141714s"
Apr 23 17:41:33.602374 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.602344 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found"
Apr 23 17:41:33.627684 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:33.627650 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9475ce23d467a37e0480df7597bbc574.slice/crio-2cc7de0150ab890076dbcf3021942e1feab062a7c810f1215d6eba3cec7f42e5 WatchSource:0}: Error finding container 2cc7de0150ab890076dbcf3021942e1feab062a7c810f1215d6eba3cec7f42e5: Status 404 returned error can't find the container with id 2cc7de0150ab890076dbcf3021942e1feab062a7c810f1215d6eba3cec7f42e5
Apr 23 17:41:33.627961 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:33.627937 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25dc2f6e99d2525192843ee005a28c4f.slice/crio-13cc9f3a19feec42ccb043fac1c21da95c3246250f49874ca04760fe7edd0f46 WatchSource:0}: Error finding container 13cc9f3a19feec42ccb043fac1c21da95c3246250f49874ca04760fe7edd0f46: Status 404 returned error can't find the container with id 13cc9f3a19feec42ccb043fac1c21da95c3246250f49874ca04760fe7edd0f46
Apr 23 17:41:33.633199 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.633151 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:41:33.644956 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.644901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal" event={"ID":"9475ce23d467a37e0480df7597bbc574","Type":"ContainerStarted","Data":"2cc7de0150ab890076dbcf3021942e1feab062a7c810f1215d6eba3cec7f42e5"}
Apr 23 17:41:33.645222 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.645206 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6c6j9"
Apr 23 17:41:33.645839 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.645814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" event={"ID":"25dc2f6e99d2525192843ee005a28c4f","Type":"ContainerStarted","Data":"13cc9f3a19feec42ccb043fac1c21da95c3246250f49874ca04760fe7edd0f46"}
Apr 23 17:41:33.648221 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.648205 2576 reflector.go:430] "Caches populated" 
type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:33.653746 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.653727 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6c6j9" Apr 23 17:41:33.703270 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.703236 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found" Apr 23 17:41:33.803913 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.803824 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found" Apr 23 17:41:33.896602 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:33.896572 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:33.904101 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:33.904078 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found" Apr 23 17:41:34.004915 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.004873 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-68.ec2.internal\" not found" Apr 23 17:41:34.082371 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.082299 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:34.107901 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.107873 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" Apr 23 17:41:34.121015 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.120986 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 
23 17:41:34.122435 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.122160 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal" Apr 23 17:41:34.129263 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.129241 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:41:34.460962 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.460900 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:41:34.486273 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.486239 2576 apiserver.go:52] "Watching apiserver" Apr 23 17:41:34.495171 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.495142 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 17:41:34.495543 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.495514 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-swcqx","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4","openshift-cluster-node-tuning-operator/tuned-tfs7c","openshift-dns/node-resolver-v6skv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal","openshift-multus/multus-additional-cni-plugins-5s6bg","openshift-network-operator/iptables-alerter-9psv4","kube-system/konnectivity-agent-j6hxm","kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal","openshift-image-registry/node-ca-4f55z","openshift-multus/multus-kl27l","openshift-multus/network-metrics-daemon-vjjmx","openshift-network-diagnostics/network-check-target-xtb9l"] Apr 23 17:41:34.500235 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.500210 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.502360 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.502337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.503546 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.502621 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.504912 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.504713 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:41:34.504912 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.504745 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-92cd5\"" Apr 23 17:41:34.504912 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.504779 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 17:41:34.504912 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.504755 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 17:41:34.505145 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.505137 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 17:41:34.505612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.505358 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 17:41:34.506160 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.505703 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-v6skv" Apr 23 17:41:34.506160 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.506084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 17:41:34.506318 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.506092 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 17:41:34.506377 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.506347 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-t7mfq\"" Apr 23 17:41:34.506942 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.506916 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:41:34.507365 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.507120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-t6cwm\"" Apr 23 17:41:34.508308 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.508288 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.508563 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.508543 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 17:41:34.508799 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.508758 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s95x9\"" Apr 23 17:41:34.509264 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.509246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 17:41:34.510703 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.510592 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.510703 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.510608 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 17:41:34.513020 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.512925 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:34.514968 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.514948 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 17:41:34.515307 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.515280 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4f55z" Apr 23 17:41:34.516374 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516355 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pm77z\"" Apr 23 17:41:34.516495 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516377 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 17:41:34.516495 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516409 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 17:41:34.516495 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516440 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 17:41:34.516648 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516500 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-kbqfn\"" Apr 23 17:41:34.516648 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 17:41:34.516648 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516357 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 17:41:34.516893 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516873 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 17:41:34.516986 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516948 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 17:41:34.516986 
ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.516963 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 17:41:34.517221 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.517168 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wtfc9\"" Apr 23 17:41:34.517221 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.517206 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 17:41:34.517837 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.517819 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.518311 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.518292 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 17:41:34.518395 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.518338 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 17:41:34.518395 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.518354 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 17:41:34.518395 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.518363 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 17:41:34.518395 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.518292 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 17:41:34.518579 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.518338 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ktjg4\"" Apr 23 17:41:34.519978 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.519959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-cnibin\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.520079 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.519986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-cni-binary-copy\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.520079 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.520079 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-kubernetes\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.520079 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520064 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.520283 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysconfig\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.520283 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysctl-conf\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.520283 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-system-cni-dir\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.520283 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-tuned\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.520283 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/303a91cd-3950-49c3-bba5-e1970d19eb67-tmp\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.520283 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2928f4e6-28bf-471f-bb81-513b3e161d32-tmp-dir\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " pod="openshift-dns/node-resolver-v6skv" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-os-release\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520371 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szq8k\" (UniqueName: \"kubernetes.io/projected/91e1f83d-4f6d-434e-b876-d8ab02848d17-kube-api-access-szq8k\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzjn\" (UniqueName: \"kubernetes.io/projected/c531f58f-450b-4518-9e3d-0be09c2473b9-kube-api-access-mqzjn\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520503 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-device-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-sys-fs\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-modprobe-d\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.520564 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-run\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5thmq\" (UniqueName: \"kubernetes.io/projected/2928f4e6-28bf-471f-bb81-513b3e161d32-kube-api-access-5thmq\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " 
pod="openshift-dns/node-resolver-v6skv" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c531f58f-450b-4518-9e3d-0be09c2473b9-iptables-alerter-script\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-socket-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.520453 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520682 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-systemd\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-sys\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-var-lib-kubelet\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c531f58f-450b-4518-9e3d-0be09c2473b9-host-slash\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-registration-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysctl-d\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520856 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-lib-modules\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520916 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cf5qv\"" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcbdg\" (UniqueName: \"kubernetes.io/projected/303a91cd-3950-49c3-bba5-e1970d19eb67-kube-api-access-bcbdg\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 
17:41:34.520951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2928f4e6-28bf-471f-bb81-513b3e161d32-hosts-file\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " pod="openshift-dns/node-resolver-v6skv" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.520995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bm5s\" (UniqueName: \"kubernetes.io/projected/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-kube-api-access-6bm5s\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.521087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.521018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-host\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.522833 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.522815 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:34.522927 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.522883 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:34.609828 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.609796 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 17:41:34.621693 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-cnibin\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.621693 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-cni-binary-copy\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.621911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.621911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-kubelet\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.621911 
ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-cni-bin\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.621911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-cnibin\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.621911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.621911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.621911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysconfig\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.621911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysconfig\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.621911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-slash\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-system-cni-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-cni-bin\") pod 
\"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.621990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/303a91cd-3950-49c3-bba5-e1970d19eb67-tmp\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2928f4e6-28bf-471f-bb81-513b3e161d32-tmp-dir\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " pod="openshift-dns/node-resolver-v6skv" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-run-netns\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-env-overrides\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c193e30-8c0e-422b-be31-7daf50d7aeb1-cni-binary-copy\") 
pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-run-ovn-kubernetes\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b659406-d1b9-4f3d-86f2-68515038c182-ovn-node-metrics-cert\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a1cf606-e60a-4909-8878-950353a863cc-konnectivity-ca\") pod \"konnectivity-agent-j6hxm\" (UID: \"4a1cf606-e60a-4909-8878-950353a863cc\") " pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622256 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brg9h\" (UniqueName: \"kubernetes.io/projected/19ad9566-830f-4ba3-bed2-db16fce5cd6a-kube-api-access-brg9h\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-cni-binary-copy\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.622334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622324 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2928f4e6-28bf-471f-bb81-513b3e161d32-tmp-dir\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " pod="openshift-dns/node-resolver-v6skv" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-device-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-modprobe-d\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-device-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-run\") pod 
\"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5thmq\" (UniqueName: \"kubernetes.io/projected/2928f4e6-28bf-471f-bb81-513b3e161d32-kube-api-access-5thmq\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " pod="openshift-dns/node-resolver-v6skv" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-cni-netd\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-cnibin\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-socket-dir-parent\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-modprobe-d\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-run\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c531f58f-450b-4518-9e3d-0be09c2473b9-iptables-alerter-script\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-socket-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622713 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-systemd\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-sys\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/91e1f83d-4f6d-434e-b876-d8ab02848d17-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-var-lib-kubelet\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-systemd\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622835 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-socket-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-var-lib-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-sys\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-ovnkube-script-lib\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-var-lib-kubelet\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623710 ip-10-0-138-68 
kubenswrapper[2576]: I0423 17:41:34.622909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4a1cf606-e60a-4909-8878-950353a863cc-agent-certs\") pod \"konnectivity-agent-j6hxm\" (UID: \"4a1cf606-e60a-4909-8878-950353a863cc\") " pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c531f58f-450b-4518-9e3d-0be09c2473b9-host-slash\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-registration-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysctl-d\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.622998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c531f58f-450b-4518-9e3d-0be09c2473b9-host-slash\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " 
pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-lib-modules\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-registration-dir\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-cni-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.623710 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-conf-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-host\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysctl-d\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-cni-multus\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-lib-modules\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-hostroot\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-host\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c531f58f-450b-4518-9e3d-0be09c2473b9-iptables-alerter-script\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-kubernetes\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-kubernetes\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c"
Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9rbv\" (UniqueName: \"kubernetes.io/projected/1b961de5-fea1-4bac-9c17-d8682d9a4242-kube-api-access-z9rbv\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx"
Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-daemon-config\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-etc-kubernetes\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysctl-conf\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c"
Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-systemd\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-netns\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.624337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-sysctl-conf\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-multus-certs\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-system-cni-dir\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-tuned\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-os-release\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-os-release\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szq8k\" (UniqueName: \"kubernetes.io/projected/91e1f83d-4f6d-434e-b876-d8ab02848d17-kube-api-access-szq8k\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-ovnkube-config\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-os-release\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19ad9566-830f-4ba3-bed2-db16fce5cd6a-host\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-kubelet\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91e1f83d-4f6d-434e-b876-d8ab02848d17-system-cni-dir\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzjn\" (UniqueName: \"kubernetes.io/projected/c531f58f-450b-4518-9e3d-0be09c2473b9-kube-api-access-mqzjn\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " pod="openshift-network-operator/iptables-alerter-9psv4"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-sys-fs\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-node-log\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.624998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-systemd-units\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-sys-fs\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-ovn\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-k8s-cni-cncf-io\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.623968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcbdg\" (UniqueName: \"kubernetes.io/projected/303a91cd-3950-49c3-bba5-e1970d19eb67-kube-api-access-bcbdg\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2928f4e6-28bf-471f-bb81-513b3e161d32-hosts-file\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " pod="openshift-dns/node-resolver-v6skv"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-etc-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2928f4e6-28bf-471f-bb81-513b3e161d32-hosts-file\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " pod="openshift-dns/node-resolver-v6skv"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-log-socket\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624187 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bm5s\" (UniqueName: \"kubernetes.io/projected/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-kube-api-access-6bm5s\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzwvf\" (UniqueName: \"kubernetes.io/projected/4b659406-d1b9-4f3d-86f2-68515038c182-kube-api-access-zzwvf\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/19ad9566-830f-4ba3-bed2-db16fce5cd6a-serviceca\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z"
Apr 23 17:41:34.625575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.624303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nks2b\" (UniqueName: \"kubernetes.io/projected/3c193e30-8c0e-422b-be31-7daf50d7aeb1-kube-api-access-nks2b\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.626073 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.625781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/303a91cd-3950-49c3-bba5-e1970d19eb67-tmp\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c"
Apr 23 17:41:34.626073 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.625818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/303a91cd-3950-49c3-bba5-e1970d19eb67-etc-tuned\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c"
Apr 23 17:41:34.640137 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.640109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzjn\" (UniqueName: \"kubernetes.io/projected/c531f58f-450b-4518-9e3d-0be09c2473b9-kube-api-access-mqzjn\") pod \"iptables-alerter-9psv4\" (UID: \"c531f58f-450b-4518-9e3d-0be09c2473b9\") " pod="openshift-network-operator/iptables-alerter-9psv4"
Apr 23 17:41:34.640239 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.640137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5thmq\" (UniqueName: \"kubernetes.io/projected/2928f4e6-28bf-471f-bb81-513b3e161d32-kube-api-access-5thmq\") pod \"node-resolver-v6skv\" (UID: \"2928f4e6-28bf-471f-bb81-513b3e161d32\") " pod="openshift-dns/node-resolver-v6skv"
Apr 23 17:41:34.640239 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.640204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcbdg\" (UniqueName: \"kubernetes.io/projected/303a91cd-3950-49c3-bba5-e1970d19eb67-kube-api-access-bcbdg\") pod \"tuned-tfs7c\" (UID: \"303a91cd-3950-49c3-bba5-e1970d19eb67\") " pod="openshift-cluster-node-tuning-operator/tuned-tfs7c"
Apr 23 17:41:34.640982 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.640958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szq8k\" (UniqueName: \"kubernetes.io/projected/91e1f83d-4f6d-434e-b876-d8ab02848d17-kube-api-access-szq8k\") pod \"multus-additional-cni-plugins-5s6bg\" (UID: \"91e1f83d-4f6d-434e-b876-d8ab02848d17\") " pod="openshift-multus/multus-additional-cni-plugins-5s6bg"
Apr 23 17:41:34.654601 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.654573 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:36:33 +0000 UTC" deadline="2027-12-16 17:02:49.966544999 +0000 UTC"
Apr 23 17:41:34.654601 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.654598 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14447h21m15.311949905s"
Apr 23 17:41:34.660609 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.660581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bm5s\" (UniqueName: \"kubernetes.io/projected/f1d4eda8-73aa-4336-8bc4-dd2b15195cac-kube-api-access-6bm5s\") pod \"aws-ebs-csi-driver-node-9v7r4\" (UID: \"f1d4eda8-73aa-4336-8bc4-dd2b15195cac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4"
Apr 23 17:41:34.725280 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzwvf\" (UniqueName: \"kubernetes.io/projected/4b659406-d1b9-4f3d-86f2-68515038c182-kube-api-access-zzwvf\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725280 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/19ad9566-830f-4ba3-bed2-db16fce5cd6a-serviceca\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z"
Apr 23 17:41:34.725280 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nks2b\" (UniqueName: \"kubernetes.io/projected/3c193e30-8c0e-422b-be31-7daf50d7aeb1-kube-api-access-nks2b\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-kubelet\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-cni-bin\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-slash\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-system-cni-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-kubelet\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-cni-bin\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-system-cni-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-slash\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-cni-bin\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-run-netns\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725549 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-cni-bin\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-run-netns\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-env-overrides\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c193e30-8c0e-422b-be31-7daf50d7aeb1-cni-binary-copy\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725647 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-run-ovn-kubernetes\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b659406-d1b9-4f3d-86f2-68515038c182-ovn-node-metrics-cert\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/19ad9566-830f-4ba3-bed2-db16fce5cd6a-serviceca\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-run-ovn-kubernetes\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a1cf606-e60a-4909-8878-950353a863cc-konnectivity-ca\") pod \"konnectivity-agent-j6hxm\" (UID: \"4a1cf606-e60a-4909-8878-950353a863cc\") " pod="kube-system/konnectivity-agent-j6hxm"
Apr 23 17:41:34.725952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brg9h\" (UniqueName: \"kubernetes.io/projected/19ad9566-830f-4ba3-bed2-db16fce5cd6a-kube-api-access-brg9h\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-cni-netd\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.725993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-cnibin\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-socket-dir-parent\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-var-lib-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-ovnkube-script-lib\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-cnibin\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4a1cf606-e60a-4909-8878-950353a863cc-agent-certs\") pod \"konnectivity-agent-j6hxm\" (UID: \"4a1cf606-e60a-4909-8878-950353a863cc\") " pod="kube-system/konnectivity-agent-j6hxm"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-env-overrides\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-socket-dir-parent\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-cni-netd\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.726243 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-cni-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c193e30-8c0e-422b-be31-7daf50d7aeb1-cni-binary-copy\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-conf-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-cni-multus\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.726333 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:35.226301852 +0000 UTC m=+3.156314484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:41:34.726375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-cni-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-cni-multus\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-hostroot\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726381 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-conf-dir\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-var-lib-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9rbv\" (UniqueName: \"kubernetes.io/projected/1b961de5-fea1-4bac-9c17-d8682d9a4242-kube-api-access-z9rbv\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-daemon-config\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-etc-kubernetes\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-hostroot\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-systemd\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-netns\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-multus-certs\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4a1cf606-e60a-4909-8878-950353a863cc-konnectivity-ca\") pod \"konnectivity-agent-j6hxm\" (UID: \"4a1cf606-e60a-4909-8878-950353a863cc\") " pod="kube-system/konnectivity-agent-j6hxm"
Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName:
\"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-os-release\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-ovnkube-config\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-systemd\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19ad9566-830f-4ba3-bed2-db16fce5cd6a-host\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z" Apr 23 17:41:34.727499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-kubelet\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-etc-kubernetes\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-ovnkube-script-lib\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-node-log\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-systemd-units\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-netns\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-multus-certs\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-ovn\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-k8s-cni-cncf-io\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-os-release\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-etc-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/19ad9566-830f-4ba3-bed2-db16fce5cd6a-host\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-var-lib-kubelet\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-node-log\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-ovn\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-etc-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/3c193e30-8c0e-422b-be31-7daf50d7aeb1-host-run-k8s-cni-cncf-io\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728061 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-log-socket\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.727026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-run-openvswitch\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.726991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-systemd-units\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.727069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-log-socket\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.727110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c193e30-8c0e-422b-be31-7daf50d7aeb1-multus-daemon-config\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.727114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b659406-d1b9-4f3d-86f2-68515038c182-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.727200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b659406-d1b9-4f3d-86f2-68515038c182-ovnkube-config\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.728196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b659406-d1b9-4f3d-86f2-68515038c182-ovn-node-metrics-cert\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.728596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.728370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4a1cf606-e60a-4909-8878-950353a863cc-agent-certs\") pod \"konnectivity-agent-j6hxm\" (UID: \"4a1cf606-e60a-4909-8878-950353a863cc\") " pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:34.734662 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.734630 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:34.734662 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.734654 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:34.734833 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.734668 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jjt8p for pod openshift-network-diagnostics/network-check-target-xtb9l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:34.734833 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:34.734760 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p podName:a73a83b1-557f-48ee-895c-c53fd945675b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:35.234739553 +0000 UTC m=+3.164752399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jjt8p" (UniqueName: "kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p") pod "network-check-target-xtb9l" (UID: "a73a83b1-557f-48ee-895c-c53fd945675b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:34.735190 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.735172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brg9h\" (UniqueName: \"kubernetes.io/projected/19ad9566-830f-4ba3-bed2-db16fce5cd6a-kube-api-access-brg9h\") pod \"node-ca-4f55z\" (UID: \"19ad9566-830f-4ba3-bed2-db16fce5cd6a\") " pod="openshift-image-registry/node-ca-4f55z" Apr 23 17:41:34.735439 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.735419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nks2b\" (UniqueName: \"kubernetes.io/projected/3c193e30-8c0e-422b-be31-7daf50d7aeb1-kube-api-access-nks2b\") pod \"multus-kl27l\" (UID: \"3c193e30-8c0e-422b-be31-7daf50d7aeb1\") " pod="openshift-multus/multus-kl27l" Apr 23 17:41:34.735536 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.735476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzwvf\" (UniqueName: \"kubernetes.io/projected/4b659406-d1b9-4f3d-86f2-68515038c182-kube-api-access-zzwvf\") pod \"ovnkube-node-swcqx\" (UID: \"4b659406-d1b9-4f3d-86f2-68515038c182\") " pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.735585 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.735569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9rbv\" (UniqueName: \"kubernetes.io/projected/1b961de5-fea1-4bac-9c17-d8682d9a4242-kube-api-access-z9rbv\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " 
pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:34.814027 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.813992 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9psv4" Apr 23 17:41:34.820752 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.820729 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" Apr 23 17:41:34.830539 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.830517 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" Apr 23 17:41:34.834107 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.834089 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v6skv" Apr 23 17:41:34.840689 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.840656 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" Apr 23 17:41:34.848292 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.848271 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:34.854822 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.854799 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:34.862374 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.862347 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4f55z" Apr 23 17:41:34.866912 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:34.866894 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kl27l" Apr 23 17:41:35.230349 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.230297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:35.230532 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:35.230436 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:35.230532 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:35.230510 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:36.230487705 +0000 UTC m=+4.160500320 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:35.316304 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:35.316236 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2928f4e6_28bf_471f_bb81_513b3e161d32.slice/crio-028970e32d090780abae4f5787dbfb597cdde26a64276f6c9f6d68a6c63292d6 WatchSource:0}: Error finding container 028970e32d090780abae4f5787dbfb597cdde26a64276f6c9f6d68a6c63292d6: Status 404 returned error can't find the container with id 028970e32d090780abae4f5787dbfb597cdde26a64276f6c9f6d68a6c63292d6 Apr 23 17:41:35.317702 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:35.317640 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91e1f83d_4f6d_434e_b876_d8ab02848d17.slice/crio-29e84cdbd8e91b286a5047b1ed5ebda009b369cbe4e62df8aa4e6160dd3c4091 WatchSource:0}: Error finding container 29e84cdbd8e91b286a5047b1ed5ebda009b369cbe4e62df8aa4e6160dd3c4091: Status 404 returned error can't find the container with id 29e84cdbd8e91b286a5047b1ed5ebda009b369cbe4e62df8aa4e6160dd3c4091 Apr 23 17:41:35.321190 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:35.321169 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303a91cd_3950_49c3_bba5_e1970d19eb67.slice/crio-6bb3daf9e39cd9333199f2f2732fd20583cb9df43244c904394ec4db620210f6 WatchSource:0}: Error finding container 6bb3daf9e39cd9333199f2f2732fd20583cb9df43244c904394ec4db620210f6: Status 404 returned error can't find the container with id 6bb3daf9e39cd9333199f2f2732fd20583cb9df43244c904394ec4db620210f6 Apr 23 17:41:35.321771 
ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:35.321665 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a1cf606_e60a_4909_8878_950353a863cc.slice/crio-da9bb1e5e17d2362b136c0dacdf0faaa840a912ee2ce7890e87c6b70aba3ff23 WatchSource:0}: Error finding container da9bb1e5e17d2362b136c0dacdf0faaa840a912ee2ce7890e87c6b70aba3ff23: Status 404 returned error can't find the container with id da9bb1e5e17d2362b136c0dacdf0faaa840a912ee2ce7890e87c6b70aba3ff23 Apr 23 17:41:35.322572 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:35.322547 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b659406_d1b9_4f3d_86f2_68515038c182.slice/crio-130744ef5782d770d1c24c997a4666c10501613fbc8e975c948174cd06568ce1 WatchSource:0}: Error finding container 130744ef5782d770d1c24c997a4666c10501613fbc8e975c948174cd06568ce1: Status 404 returned error can't find the container with id 130744ef5782d770d1c24c997a4666c10501613fbc8e975c948174cd06568ce1 Apr 23 17:41:35.324024 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:35.324000 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc531f58f_450b_4518_9e3d_0be09c2473b9.slice/crio-7b67ddc184a1ef2fdec254c39fa9927b1c0461e55552ce7e6d73e4baac96a227 WatchSource:0}: Error finding container 7b67ddc184a1ef2fdec254c39fa9927b1c0461e55552ce7e6d73e4baac96a227: Status 404 returned error can't find the container with id 7b67ddc184a1ef2fdec254c39fa9927b1c0461e55552ce7e6d73e4baac96a227 Apr 23 17:41:35.325300 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:35.325142 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c193e30_8c0e_422b_be31_7daf50d7aeb1.slice/crio-8c601cb7db6d58488c9c4560e7e1aa46027731098c6f70d0c88b32eed0be5171 WatchSource:0}: Error 
finding container 8c601cb7db6d58488c9c4560e7e1aa46027731098c6f70d0c88b32eed0be5171: Status 404 returned error can't find the container with id 8c601cb7db6d58488c9c4560e7e1aa46027731098c6f70d0c88b32eed0be5171 Apr 23 17:41:35.327899 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:41:35.326904 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d4eda8_73aa_4336_8bc4_dd2b15195cac.slice/crio-fbeb14b6f1ac09c952f6143a68ebcc938032d94a07e6166d5f7584bff453f35d WatchSource:0}: Error finding container fbeb14b6f1ac09c952f6143a68ebcc938032d94a07e6166d5f7584bff453f35d: Status 404 returned error can't find the container with id fbeb14b6f1ac09c952f6143a68ebcc938032d94a07e6166d5f7584bff453f35d Apr 23 17:41:35.330707 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.330684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:35.330821 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:35.330807 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:35.330878 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:35.330825 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:35.330878 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:35.330835 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jjt8p for pod openshift-network-diagnostics/network-check-target-xtb9l: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:35.330878 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:35.330877 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p podName:a73a83b1-557f-48ee-895c-c53fd945675b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:36.330861313 +0000 UTC m=+4.260873944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jjt8p" (UniqueName: "kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p") pod "network-check-target-xtb9l" (UID: "a73a83b1-557f-48ee-895c-c53fd945675b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:35.643857 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.643587 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:35.644226 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:35.643897 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:35.650381 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.650347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"130744ef5782d770d1c24c997a4666c10501613fbc8e975c948174cd06568ce1"} Apr 23 17:41:35.651427 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.651399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j6hxm" event={"ID":"4a1cf606-e60a-4909-8878-950353a863cc","Type":"ContainerStarted","Data":"da9bb1e5e17d2362b136c0dacdf0faaa840a912ee2ce7890e87c6b70aba3ff23"} Apr 23 17:41:35.654180 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.654156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v6skv" event={"ID":"2928f4e6-28bf-471f-bb81-513b3e161d32","Type":"ContainerStarted","Data":"028970e32d090780abae4f5787dbfb597cdde26a64276f6c9f6d68a6c63292d6"} Apr 23 17:41:35.654792 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.654764 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:36:33 +0000 UTC" deadline="2027-10-30 17:01:35.351351023 +0000 UTC" Apr 23 17:41:35.654851 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.654792 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13319h19m59.696562235s" Apr 23 17:41:35.657101 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.657080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal" event={"ID":"9475ce23d467a37e0480df7597bbc574","Type":"ContainerStarted","Data":"287bdd666632b288acfc0ea5a071d98bc492e636ae6e8c22d452c5ba7271b82a"} Apr 23 17:41:35.658180 ip-10-0-138-68 
kubenswrapper[2576]: I0423 17:41:35.658153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4f55z" event={"ID":"19ad9566-830f-4ba3-bed2-db16fce5cd6a","Type":"ContainerStarted","Data":"b55a21775f371f0146899b8891bc511b21bb7359dad3d97ccdf2eda60accf66b"} Apr 23 17:41:35.660453 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.660428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kl27l" event={"ID":"3c193e30-8c0e-422b-be31-7daf50d7aeb1","Type":"ContainerStarted","Data":"8c601cb7db6d58488c9c4560e7e1aa46027731098c6f70d0c88b32eed0be5171"} Apr 23 17:41:35.662127 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.662103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9psv4" event={"ID":"c531f58f-450b-4518-9e3d-0be09c2473b9","Type":"ContainerStarted","Data":"7b67ddc184a1ef2fdec254c39fa9927b1c0461e55552ce7e6d73e4baac96a227"} Apr 23 17:41:35.665616 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.665583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" event={"ID":"303a91cd-3950-49c3-bba5-e1970d19eb67","Type":"ContainerStarted","Data":"6bb3daf9e39cd9333199f2f2732fd20583cb9df43244c904394ec4db620210f6"} Apr 23 17:41:35.667808 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.667787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerStarted","Data":"29e84cdbd8e91b286a5047b1ed5ebda009b369cbe4e62df8aa4e6160dd3c4091"} Apr 23 17:41:35.671018 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:35.670993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" 
event={"ID":"f1d4eda8-73aa-4336-8bc4-dd2b15195cac","Type":"ContainerStarted","Data":"fbeb14b6f1ac09c952f6143a68ebcc938032d94a07e6166d5f7584bff453f35d"} Apr 23 17:41:36.240174 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:36.240140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:36.240340 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:36.240270 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:36.240340 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:36.240329 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:38.24031073 +0000 UTC m=+6.170323349 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:36.341336 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:36.341294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:36.341514 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:36.341479 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:36.341514 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:36.341498 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:36.341514 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:36.341513 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jjt8p for pod openshift-network-diagnostics/network-check-target-xtb9l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:36.341668 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:36.341572 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p podName:a73a83b1-557f-48ee-895c-c53fd945675b nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:38.341554029 +0000 UTC m=+6.271566651 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jjt8p" (UniqueName: "kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p") pod "network-check-target-xtb9l" (UID: "a73a83b1-557f-48ee-895c-c53fd945675b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:36.643389 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:36.643354 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:36.643570 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:36.643488 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:36.691538 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:36.691499 2576 generic.go:358] "Generic (PLEG): container finished" podID="25dc2f6e99d2525192843ee005a28c4f" containerID="8bcbe57a6c1dc78bb6531801357a469f6d1bdca32ff8d950b10e669a721ed48c" exitCode=0 Apr 23 17:41:36.692063 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:36.691628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" event={"ID":"25dc2f6e99d2525192843ee005a28c4f","Type":"ContainerDied","Data":"8bcbe57a6c1dc78bb6531801357a469f6d1bdca32ff8d950b10e669a721ed48c"} Apr 23 17:41:36.709407 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:36.709352 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-68.ec2.internal" podStartSLOduration=2.709331422 podStartE2EDuration="2.709331422s" podCreationTimestamp="2026-04-23 17:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:41:35.671703812 +0000 UTC m=+3.601716449" watchObservedRunningTime="2026-04-23 17:41:36.709331422 +0000 UTC m=+4.639344059" Apr 23 17:41:37.643019 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:37.642958 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:37.643223 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:37.643112 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:37.720028 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:37.719985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" event={"ID":"25dc2f6e99d2525192843ee005a28c4f","Type":"ContainerStarted","Data":"35898cb2f72fc75c9f1737b1565ad717392c233778522879a2188c04e0b0dd6d"} Apr 23 17:41:37.739359 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:37.738976 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-68.ec2.internal" podStartSLOduration=3.7389556859999997 podStartE2EDuration="3.738955686s" podCreationTimestamp="2026-04-23 17:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:41:37.737126906 +0000 UTC m=+5.667139614" watchObservedRunningTime="2026-04-23 17:41:37.738955686 +0000 UTC m=+5.668968323" Apr 23 17:41:38.256788 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:38.256749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:38.256980 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:38.256948 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:38.257487 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:38.257043 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs 
podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:42.257022494 +0000 UTC m=+10.187035124 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:38.358197 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:38.358142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:38.358390 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:38.358343 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:38.358390 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:38.358362 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:38.358390 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:38.358375 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jjt8p for pod openshift-network-diagnostics/network-check-target-xtb9l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:38.358539 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:38.358435 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p podName:a73a83b1-557f-48ee-895c-c53fd945675b nodeName:}" failed. No retries permitted until 2026-04-23 17:41:42.358414803 +0000 UTC m=+10.288427421 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jjt8p" (UniqueName: "kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p") pod "network-check-target-xtb9l" (UID: "a73a83b1-557f-48ee-895c-c53fd945675b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:38.644562 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:38.643495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:38.644562 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:38.643624 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:39.643019 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:39.642973 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:39.643415 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:39.643126 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:40.643665 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:40.643634 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:40.644143 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:40.643791 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:41.642982 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:41.642950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:41.643192 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:41.643100 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:42.293177 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:42.293061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:42.293639 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:42.293241 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:42.293639 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:42.293318 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:41:50.293297842 +0000 UTC m=+18.223310476 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:42.393531 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:42.393489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:42.393796 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:42.393739 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:42.393796 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:42.393760 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:42.393796 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:42.393773 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jjt8p for pod openshift-network-diagnostics/network-check-target-xtb9l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:42.393972 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:42.393855 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p podName:a73a83b1-557f-48ee-895c-c53fd945675b nodeName:}" failed. 
No retries permitted until 2026-04-23 17:41:50.393818821 +0000 UTC m=+18.323831440 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jjt8p" (UniqueName: "kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p") pod "network-check-target-xtb9l" (UID: "a73a83b1-557f-48ee-895c-c53fd945675b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:42.644670 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:42.644124 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:42.644670 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:42.644236 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:43.643101 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:43.643069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:43.643591 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:43.643216 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:44.643731 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:44.643677 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:44.644171 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:44.643824 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:45.643283 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:45.643249 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:45.643469 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:45.643374 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:46.643831 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:46.643795 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:46.644252 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:46.643898 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:47.642803 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:47.642762 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:47.642947 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:47.642875 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:48.643412 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:48.643370 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:48.643921 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:48.643508 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:49.643105 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:49.643056 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:49.643286 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:49.643209 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:50.345621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:50.345582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:50.346073 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:50.345746 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:50.346073 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:50.345821 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:06.345800493 +0000 UTC m=+34.275813106 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:41:50.446736 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:50.446688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:50.446994 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:50.446856 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:41:50.446994 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:50.446879 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:41:50.446994 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:50.446906 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jjt8p for pod openshift-network-diagnostics/network-check-target-xtb9l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:50.446994 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:50.446963 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p podName:a73a83b1-557f-48ee-895c-c53fd945675b nodeName:}" failed. 
No retries permitted until 2026-04-23 17:42:06.446945371 +0000 UTC m=+34.376957999 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jjt8p" (UniqueName: "kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p") pod "network-check-target-xtb9l" (UID: "a73a83b1-557f-48ee-895c-c53fd945675b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:41:50.643652 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:50.643564 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:50.643832 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:50.643704 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:51.642925 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:51.642888 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:51.643396 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:51.643042 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:52.643735 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.643695 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:52.644309 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:52.643817 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:52.746649 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.746621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:41:52.746978 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.746952 2576 generic.go:358] "Generic (PLEG): container finished" podID="4b659406-d1b9-4f3d-86f2-68515038c182" containerID="d089c95150a58af1b1676989d0465dbb9a92dcd83ba4f64f69641109bf51b33f" exitCode=1 Apr 23 17:41:52.747078 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.747014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"17f35e9d52cc20bcf11b04e3de5f7e6f40d742a9179c15cebf8c1218aa414161"} Apr 23 17:41:52.747078 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.747035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" 
event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerDied","Data":"d089c95150a58af1b1676989d0465dbb9a92dcd83ba4f64f69641109bf51b33f"} Apr 23 17:41:52.747078 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.747051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"07e5d8070cc381618e96141f7054f740c3802996618ec4d6fdbf1ee89c2e65e3"} Apr 23 17:41:52.748434 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.748407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j6hxm" event={"ID":"4a1cf606-e60a-4909-8878-950353a863cc","Type":"ContainerStarted","Data":"9ff265415256e5ace0920a8c75add6f995dfd1c294efbeec21133df19ea9c9bf"} Apr 23 17:41:52.750132 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.750103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v6skv" event={"ID":"2928f4e6-28bf-471f-bb81-513b3e161d32","Type":"ContainerStarted","Data":"69cf9796269a1218d2da5a5a57310f53731d9cdf278bb8f0a1643de49bc2dca7"} Apr 23 17:41:52.751764 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.751735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4f55z" event={"ID":"19ad9566-830f-4ba3-bed2-db16fce5cd6a","Type":"ContainerStarted","Data":"6f028fe43c67832d306f445df2e3b5d60df6438f8eb1de3b5a3a71705fa8f8fe"} Apr 23 17:41:52.753144 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.753105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kl27l" event={"ID":"3c193e30-8c0e-422b-be31-7daf50d7aeb1","Type":"ContainerStarted","Data":"ff3527b4d32d5cb85168ab8e18ce027b7a30713ed92e63689307c8423ec0c5bb"} Apr 23 17:41:52.754375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.754357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" 
event={"ID":"303a91cd-3950-49c3-bba5-e1970d19eb67","Type":"ContainerStarted","Data":"5259988edafb16bf388e6416740fe64c06d6d9667be1d21d367e758a5f5e0504"} Apr 23 17:41:52.755829 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.755801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerStarted","Data":"dc986715711f0693e10057bcee497b3b15df676b308089f50c93270b44d4a930"} Apr 23 17:41:52.757215 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.757191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" event={"ID":"f1d4eda8-73aa-4336-8bc4-dd2b15195cac","Type":"ContainerStarted","Data":"6bca7b1c305af582e0d943bf041fc5b26fabcba69b5c1b728899c59ff375b5ba"} Apr 23 17:41:52.766516 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.766467 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j6hxm" podStartSLOduration=3.977755691 podStartE2EDuration="20.766452101s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:41:35.323810757 +0000 UTC m=+3.253823376" lastFinishedPulling="2026-04-23 17:41:52.112507159 +0000 UTC m=+20.042519786" observedRunningTime="2026-04-23 17:41:52.765413606 +0000 UTC m=+20.695426242" watchObservedRunningTime="2026-04-23 17:41:52.766452101 +0000 UTC m=+20.696464738" Apr 23 17:41:52.783241 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.783192 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v6skv" podStartSLOduration=3.989361101 podStartE2EDuration="20.783178753s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:41:35.318299965 +0000 UTC m=+3.248312579" lastFinishedPulling="2026-04-23 17:41:52.112117618 +0000 UTC m=+20.042130231" observedRunningTime="2026-04-23 
17:41:52.783079942 +0000 UTC m=+20.713092581" watchObservedRunningTime="2026-04-23 17:41:52.783178753 +0000 UTC m=+20.713191388" Apr 23 17:41:52.825244 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.825201 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tfs7c" podStartSLOduration=4.033902688 podStartE2EDuration="20.825186294s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:41:35.32294382 +0000 UTC m=+3.252956434" lastFinishedPulling="2026-04-23 17:41:52.114227406 +0000 UTC m=+20.044240040" observedRunningTime="2026-04-23 17:41:52.825111147 +0000 UTC m=+20.755123784" watchObservedRunningTime="2026-04-23 17:41:52.825186294 +0000 UTC m=+20.755198929" Apr 23 17:41:52.840511 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.840471 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4f55z" podStartSLOduration=4.058863154 podStartE2EDuration="20.84045736s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:41:35.330518722 +0000 UTC m=+3.260531336" lastFinishedPulling="2026-04-23 17:41:52.112112925 +0000 UTC m=+20.042125542" observedRunningTime="2026-04-23 17:41:52.840429079 +0000 UTC m=+20.770441715" watchObservedRunningTime="2026-04-23 17:41:52.84045736 +0000 UTC m=+20.770469996" Apr 23 17:41:52.860476 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:52.860438 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kl27l" podStartSLOduration=4.039040344 podStartE2EDuration="20.860426148s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:41:35.328501426 +0000 UTC m=+3.258514055" lastFinishedPulling="2026-04-23 17:41:52.149887245 +0000 UTC m=+20.079899859" observedRunningTime="2026-04-23 17:41:52.860173967 +0000 UTC m=+20.790186599" watchObservedRunningTime="2026-04-23 
17:41:52.860426148 +0000 UTC m=+20.790438784" Apr 23 17:41:53.643769 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.643737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:53.644269 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:53.643862 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:53.761422 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.761394 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:41:53.761848 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.761816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"f1e62365574f84d0485468977c1900497c90e9e0cf2be2e5a2d8f4aaff9fe250"} Apr 23 17:41:53.761971 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.761856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"36d368e9a4333d4a4241c4f62e187f042d5abf555561690f824f5af7a17e7a60"} Apr 23 17:41:53.761971 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.761870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"d8524058553ac3007163193fabe8233ffd502c7ceed1fd1b64d2e9d0e0183ee4"} Apr 
23 17:41:53.763103 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.763082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9psv4" event={"ID":"c531f58f-450b-4518-9e3d-0be09c2473b9","Type":"ContainerStarted","Data":"40a2754d8dd4d67fab39da87b410f3248aee573af2f9117dae2f94fdb6f85ded"} Apr 23 17:41:53.764496 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.764475 2576 generic.go:358] "Generic (PLEG): container finished" podID="91e1f83d-4f6d-434e-b876-d8ab02848d17" containerID="dc986715711f0693e10057bcee497b3b15df676b308089f50c93270b44d4a930" exitCode=0 Apr 23 17:41:53.764590 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.764572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerDied","Data":"dc986715711f0693e10057bcee497b3b15df676b308089f50c93270b44d4a930"} Apr 23 17:41:53.786532 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.786442 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9psv4" podStartSLOduration=5.0012592 podStartE2EDuration="21.786426137s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:41:35.327183315 +0000 UTC m=+3.257195941" lastFinishedPulling="2026-04-23 17:41:52.112350254 +0000 UTC m=+20.042362878" observedRunningTime="2026-04-23 17:41:53.785421208 +0000 UTC m=+21.715433843" watchObservedRunningTime="2026-04-23 17:41:53.786426137 +0000 UTC m=+21.716438775" Apr 23 17:41:53.920370 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:53.920312 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 17:41:54.459863 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:54.459830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:54.590238 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:54.590080 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:41:53.920331769Z","UUID":"5b39cc48-9ec2-487c-afbf-7804b7da305b","Handler":null,"Name":"","Endpoint":""} Apr 23 17:41:54.592903 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:54.592878 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 17:41:54.592903 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:54.592910 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 17:41:54.643810 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:54.643630 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:54.643810 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:54.643771 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:54.768856 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:54.768768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" event={"ID":"f1d4eda8-73aa-4336-8bc4-dd2b15195cac","Type":"ContainerStarted","Data":"a73ce864cd1f354c4019b198d14d41b45526824f4c61e017580395aaf445e80e"} Apr 23 17:41:55.643477 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:55.643291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:55.643631 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:55.643576 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:56.209758 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:56.209696 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:56.210515 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:56.210494 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:56.643036 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:56.643005 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:56.643225 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:56.643110 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:56.772156 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:56.772130 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j6hxm" Apr 23 17:41:57.643651 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:57.643619 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:57.644205 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:57.643754 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:57.774323 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:57.774288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" event={"ID":"f1d4eda8-73aa-4336-8bc4-dd2b15195cac","Type":"ContainerStarted","Data":"b2cdfb31d972d7c4538a60025af5adfe8d77a809791cd201ea1ce3a3318b443b"} Apr 23 17:41:57.776767 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:57.776746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:41:57.777089 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:57.777059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"53f03deaf8be9a5709db96efff6358ba267c40fb64ba20fb46a5ee79a0e0db34"} Apr 23 17:41:57.778476 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:57.778447 2576 generic.go:358] "Generic (PLEG): container finished" podID="91e1f83d-4f6d-434e-b876-d8ab02848d17" containerID="eaa1ee9e99511889f3fcc717956d04b9b74691399d87f8305798f204ce3bc0d2" exitCode=0 Apr 23 17:41:57.778565 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:57.778530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerDied","Data":"eaa1ee9e99511889f3fcc717956d04b9b74691399d87f8305798f204ce3bc0d2"} Apr 23 17:41:57.839001 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:57.838949 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v7r4" podStartSLOduration=4.383814534 podStartE2EDuration="25.838935427s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" 
firstStartedPulling="2026-04-23 17:41:35.330454182 +0000 UTC m=+3.260466796" lastFinishedPulling="2026-04-23 17:41:56.785575072 +0000 UTC m=+24.715587689" observedRunningTime="2026-04-23 17:41:57.800897462 +0000 UTC m=+25.730910108" watchObservedRunningTime="2026-04-23 17:41:57.838935427 +0000 UTC m=+25.768948093" Apr 23 17:41:58.643524 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:58.643494 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:41:58.643661 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:58.643623 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:41:58.782029 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:58.781945 2576 generic.go:358] "Generic (PLEG): container finished" podID="91e1f83d-4f6d-434e-b876-d8ab02848d17" containerID="7bd43f1591ebcfdab65034d8b5ae25a83e2b74a02f8e0fc72e1762c60db07f63" exitCode=0 Apr 23 17:41:58.782029 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:58.782016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerDied","Data":"7bd43f1591ebcfdab65034d8b5ae25a83e2b74a02f8e0fc72e1762c60db07f63"} Apr 23 17:41:59.643129 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.642880 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:41:59.643283 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:41:59.643159 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:41:59.785915 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.785882 2576 generic.go:358] "Generic (PLEG): container finished" podID="91e1f83d-4f6d-434e-b876-d8ab02848d17" containerID="0dda39c8392d29ba26e20831385962c6ddbe8949d09f138c04e195b4c69502c8" exitCode=0 Apr 23 17:41:59.786327 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.785966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerDied","Data":"0dda39c8392d29ba26e20831385962c6ddbe8949d09f138c04e195b4c69502c8"} Apr 23 17:41:59.789057 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.789038 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:41:59.789354 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.789335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"3fefb04f7dac415255a36a870769c1d1ebca4f88c7e49f346d8857c7252b4aa3"} Apr 23 17:41:59.789644 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.789621 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:59.789736 
ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.789661 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:59.789736 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.789674 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:59.789883 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.789869 2576 scope.go:117] "RemoveContainer" containerID="d089c95150a58af1b1676989d0465dbb9a92dcd83ba4f64f69641109bf51b33f" Apr 23 17:41:59.805851 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.805820 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:41:59.805984 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:41:59.805898 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:42:00.643375 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:00.643051 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:42:00.643375 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:00.643196 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:42:00.796346 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:00.796318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:42:00.796794 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:00.796746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" event={"ID":"4b659406-d1b9-4f3d-86f2-68515038c182","Type":"ContainerStarted","Data":"037872cf3a4ca941b3386ea49682491491e81d0bec579e52b0b7fb82f8554a86"} Apr 23 17:42:00.832349 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:00.832293 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" podStartSLOduration=11.763860704 podStartE2EDuration="28.832279051s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:41:35.324976039 +0000 UTC m=+3.254988659" lastFinishedPulling="2026-04-23 17:41:52.393394375 +0000 UTC m=+20.323407006" observedRunningTime="2026-04-23 17:42:00.831925089 +0000 UTC m=+28.761937724" watchObservedRunningTime="2026-04-23 17:42:00.832279051 +0000 UTC m=+28.762291686" Apr 23 17:42:00.870845 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:00.870266 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xtb9l"] Apr 23 17:42:00.870845 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:00.870373 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:42:00.870845 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:00.870474 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:42:00.870845 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:00.870704 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vjjmx"] Apr 23 17:42:00.870845 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:00.870832 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:42:00.871233 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:00.870929 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:42:02.646378 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:02.646199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:42:02.646781 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:02.646198 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:42:02.646781 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:02.646449 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:42:02.646781 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:02.646504 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:42:04.645737 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:04.645695 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:42:04.646195 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:04.645699 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:42:04.646195 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:04.645805 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xtb9l" podUID="a73a83b1-557f-48ee-895c-c53fd945675b" Apr 23 17:42:04.646195 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:04.645906 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242" Apr 23 17:42:05.435472 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.435441 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-68.ec2.internal" event="NodeReady" Apr 23 17:42:05.435803 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.435574 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 17:42:05.485943 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.485908 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fxb9d"] Apr 23 17:42:05.490905 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.490874 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z8pdq"] Apr 23 17:42:05.491091 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.491039 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.493830 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.493810 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:05.494598 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.494580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:42:05.495169 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.495147 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zw6qq\"" Apr 23 17:42:05.495294 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.495150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:42:05.496978 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.496960 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:42:05.497191 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.497174 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:42:05.497273 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.497175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-576jd\"" Apr 23 17:42:05.497759 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.497708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:42:05.502195 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.502172 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fxb9d"] Apr 23 17:42:05.506303 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.506280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z8pdq"] Apr 23 17:42:05.562587 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.562547 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fad7629-2a8f-44c8-8668-437d00f77bca-config-volume\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.562782 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.562683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.562782 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.562745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9fad7629-2a8f-44c8-8668-437d00f77bca-tmp-dir\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.562882 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.562780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhdxs\" (UniqueName: \"kubernetes.io/projected/9fad7629-2a8f-44c8-8668-437d00f77bca-kube-api-access-vhdxs\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.663084 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.663043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fad7629-2a8f-44c8-8668-437d00f77bca-config-volume\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.663084 ip-10-0-138-68 kubenswrapper[2576]: I0423 
17:42:05.663091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:05.663757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.663158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.663757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.663192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9fad7629-2a8f-44c8-8668-437d00f77bca-tmp-dir\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.663757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.663217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8tzh\" (UniqueName: \"kubernetes.io/projected/1714203d-7df7-4a8f-8d58-69bc1d7062f4-kube-api-access-x8tzh\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:05.663757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.663243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhdxs\" (UniqueName: \"kubernetes.io/projected/9fad7629-2a8f-44c8-8668-437d00f77bca-kube-api-access-vhdxs\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.663757 ip-10-0-138-68 kubenswrapper[2576]: E0423 
17:42:05.663314 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:05.663757 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:05.663398 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls podName:9fad7629-2a8f-44c8-8668-437d00f77bca nodeName:}" failed. No retries permitted until 2026-04-23 17:42:06.163380708 +0000 UTC m=+34.093393322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls") pod "dns-default-fxb9d" (UID: "9fad7629-2a8f-44c8-8668-437d00f77bca") : secret "dns-default-metrics-tls" not found Apr 23 17:42:05.663757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.663596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9fad7629-2a8f-44c8-8668-437d00f77bca-tmp-dir\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.663757 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.663744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9fad7629-2a8f-44c8-8668-437d00f77bca-config-volume\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.675201 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.675175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhdxs\" (UniqueName: \"kubernetes.io/projected/9fad7629-2a8f-44c8-8668-437d00f77bca-kube-api-access-vhdxs\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:05.764289 ip-10-0-138-68 kubenswrapper[2576]: I0423 
17:42:05.764206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:05.764446 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.764310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8tzh\" (UniqueName: \"kubernetes.io/projected/1714203d-7df7-4a8f-8d58-69bc1d7062f4-kube-api-access-x8tzh\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:05.764446 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:05.764390 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:05.764534 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:05.764470 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert podName:1714203d-7df7-4a8f-8d58-69bc1d7062f4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:06.264452194 +0000 UTC m=+34.194464813 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert") pod "ingress-canary-z8pdq" (UID: "1714203d-7df7-4a8f-8d58-69bc1d7062f4") : secret "canary-serving-cert" not found Apr 23 17:42:05.776907 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:05.776875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8tzh\" (UniqueName: \"kubernetes.io/projected/1714203d-7df7-4a8f-8d58-69bc1d7062f4-kube-api-access-x8tzh\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:06.167673 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.167625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:06.167926 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.167809 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:06.167926 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.167904 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls podName:9fad7629-2a8f-44c8-8668-437d00f77bca nodeName:}" failed. No retries permitted until 2026-04-23 17:42:07.167884968 +0000 UTC m=+35.097897604 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls") pod "dns-default-fxb9d" (UID: "9fad7629-2a8f-44c8-8668-437d00f77bca") : secret "dns-default-metrics-tls" not found Apr 23 17:42:06.268592 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.268552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:06.268797 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.268690 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:06.268797 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.268783 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert podName:1714203d-7df7-4a8f-8d58-69bc1d7062f4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:07.268762513 +0000 UTC m=+35.198775157 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert") pod "ingress-canary-z8pdq" (UID: "1714203d-7df7-4a8f-8d58-69bc1d7062f4") : secret "canary-serving-cert" not found Apr 23 17:42:06.369682 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.369637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:42:06.369862 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.369768 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:42:06.369920 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.369874 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:38.369853951 +0000 UTC m=+66.299866584 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:42:06.470263 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.470175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:42:06.470411 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.470357 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:42:06.470411 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.470377 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:42:06.470411 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.470389 2576 projected.go:194] Error preparing data for projected volume kube-api-access-jjt8p for pod openshift-network-diagnostics/network-check-target-xtb9l: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:42:06.470544 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:06.470454 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p podName:a73a83b1-557f-48ee-895c-c53fd945675b nodeName:}" failed. 
No retries permitted until 2026-04-23 17:42:38.470434674 +0000 UTC m=+66.400447311 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jjt8p" (UniqueName: "kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p") pod "network-check-target-xtb9l" (UID: "a73a83b1-557f-48ee-895c-c53fd945675b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:42:06.643384 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.643347 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:42:06.643384 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.643378 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:42:06.647081 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.647043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:42:06.647220 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.647083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:42:06.647220 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.647083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:42:06.648217 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.648199 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdbsc\"" Apr 23 17:42:06.648535 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:06.648521 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gwf2x\"" Apr 23 17:42:07.176455 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:07.176419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:07.177141 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:07.176575 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:07.177141 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:07.176662 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls podName:9fad7629-2a8f-44c8-8668-437d00f77bca nodeName:}" failed. No retries permitted until 2026-04-23 17:42:09.17663051 +0000 UTC m=+37.106643144 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls") pod "dns-default-fxb9d" (UID: "9fad7629-2a8f-44c8-8668-437d00f77bca") : secret "dns-default-metrics-tls" not found Apr 23 17:42:07.276847 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:07.276807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:07.277036 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:07.276945 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:07.277036 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:07.277011 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert podName:1714203d-7df7-4a8f-8d58-69bc1d7062f4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:09.276991244 +0000 UTC m=+37.207003872 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert") pod "ingress-canary-z8pdq" (UID: "1714203d-7df7-4a8f-8d58-69bc1d7062f4") : secret "canary-serving-cert" not found Apr 23 17:42:08.814450 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:08.814409 2576 generic.go:358] "Generic (PLEG): container finished" podID="91e1f83d-4f6d-434e-b876-d8ab02848d17" containerID="5aa1e69f894eaea46f18907351d17d03373f439a89ce7e46a0eebc04c5b8754b" exitCode=0 Apr 23 17:42:08.814450 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:08.814459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerDied","Data":"5aa1e69f894eaea46f18907351d17d03373f439a89ce7e46a0eebc04c5b8754b"} Apr 23 17:42:09.191929 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:09.191830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:09.192060 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:09.191979 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:09.192060 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:09.192046 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls podName:9fad7629-2a8f-44c8-8668-437d00f77bca nodeName:}" failed. No retries permitted until 2026-04-23 17:42:13.192030959 +0000 UTC m=+41.122043573 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls") pod "dns-default-fxb9d" (UID: "9fad7629-2a8f-44c8-8668-437d00f77bca") : secret "dns-default-metrics-tls" not found Apr 23 17:42:09.293168 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:09.293139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:09.293311 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:09.293259 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:09.293311 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:09.293305 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert podName:1714203d-7df7-4a8f-8d58-69bc1d7062f4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:13.293292123 +0000 UTC m=+41.223304738 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert") pod "ingress-canary-z8pdq" (UID: "1714203d-7df7-4a8f-8d58-69bc1d7062f4") : secret "canary-serving-cert" not found Apr 23 17:42:09.818767 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:09.818736 2576 generic.go:358] "Generic (PLEG): container finished" podID="91e1f83d-4f6d-434e-b876-d8ab02848d17" containerID="c755e2b5715e5c9801ac425e67242263c49d725a691735ec47b509839a6319ee" exitCode=0 Apr 23 17:42:09.819138 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:09.818789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerDied","Data":"c755e2b5715e5c9801ac425e67242263c49d725a691735ec47b509839a6319ee"} Apr 23 17:42:10.823552 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:10.823362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" event={"ID":"91e1f83d-4f6d-434e-b876-d8ab02848d17","Type":"ContainerStarted","Data":"8ac92886a363c14a19ae28f9f8529c2192dd0515738fdd6e1c1532c8606adf08"} Apr 23 17:42:10.850099 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:10.850002 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5s6bg" podStartSLOduration=6.215186063 podStartE2EDuration="38.849982713s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:41:35.31948586 +0000 UTC m=+3.249498474" lastFinishedPulling="2026-04-23 17:42:07.954282494 +0000 UTC m=+35.884295124" observedRunningTime="2026-04-23 17:42:10.848543016 +0000 UTC m=+38.778555652" watchObservedRunningTime="2026-04-23 17:42:10.849982713 +0000 UTC m=+38.779995350" Apr 23 17:42:13.223082 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:13.223028 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:13.223492 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:13.223180 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:13.223492 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:13.223243 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls podName:9fad7629-2a8f-44c8-8668-437d00f77bca nodeName:}" failed. No retries permitted until 2026-04-23 17:42:21.223226505 +0000 UTC m=+49.153239119 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls") pod "dns-default-fxb9d" (UID: "9fad7629-2a8f-44c8-8668-437d00f77bca") : secret "dns-default-metrics-tls" not found Apr 23 17:42:13.323689 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:13.323642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:13.323897 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:13.323835 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:13.323960 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:13.323913 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert podName:1714203d-7df7-4a8f-8d58-69bc1d7062f4 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:42:21.32389238 +0000 UTC m=+49.253905008 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert") pod "ingress-canary-z8pdq" (UID: "1714203d-7df7-4a8f-8d58-69bc1d7062f4") : secret "canary-serving-cert" not found Apr 23 17:42:21.282614 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:21.282558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:21.283060 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:21.282741 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:21.283060 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:21.282809 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls podName:9fad7629-2a8f-44c8-8668-437d00f77bca nodeName:}" failed. No retries permitted until 2026-04-23 17:42:37.282792413 +0000 UTC m=+65.212805026 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls") pod "dns-default-fxb9d" (UID: "9fad7629-2a8f-44c8-8668-437d00f77bca") : secret "dns-default-metrics-tls" not found Apr 23 17:42:21.383772 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:21.383713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:21.383959 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:21.383859 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:21.383959 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:21.383943 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert podName:1714203d-7df7-4a8f-8d58-69bc1d7062f4 nodeName:}" failed. No retries permitted until 2026-04-23 17:42:37.383925509 +0000 UTC m=+65.313938123 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert") pod "ingress-canary-z8pdq" (UID: "1714203d-7df7-4a8f-8d58-69bc1d7062f4") : secret "canary-serving-cert" not found Apr 23 17:42:31.813165 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:31.813136 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-swcqx" Apr 23 17:42:37.285078 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:37.285031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d" Apr 23 17:42:37.285572 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:37.285190 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:42:37.285572 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:37.285256 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls podName:9fad7629-2a8f-44c8-8668-437d00f77bca nodeName:}" failed. No retries permitted until 2026-04-23 17:43:09.285241982 +0000 UTC m=+97.215254596 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls") pod "dns-default-fxb9d" (UID: "9fad7629-2a8f-44c8-8668-437d00f77bca") : secret "dns-default-metrics-tls" not found Apr 23 17:42:37.385401 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:37.385355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq" Apr 23 17:42:37.385499 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:37.385470 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:42:37.385544 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:37.385517 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert podName:1714203d-7df7-4a8f-8d58-69bc1d7062f4 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:09.385505429 +0000 UTC m=+97.315518043 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert") pod "ingress-canary-z8pdq" (UID: "1714203d-7df7-4a8f-8d58-69bc1d7062f4") : secret "canary-serving-cert" not found Apr 23 17:42:38.391013 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.390958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:42:38.393927 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.393903 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:42:38.401338 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:38.401319 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:42:38.401408 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:42:38.401398 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:42.401383854 +0000 UTC m=+130.331396467 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : secret "metrics-daemon-secret" not found Apr 23 17:42:38.492056 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.492020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:42:38.495211 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.495192 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:42:38.505588 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.505565 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:42:38.516276 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.516248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjt8p\" (UniqueName: \"kubernetes.io/projected/a73a83b1-557f-48ee-895c-c53fd945675b-kube-api-access-jjt8p\") pod \"network-check-target-xtb9l\" (UID: \"a73a83b1-557f-48ee-895c-c53fd945675b\") " pod="openshift-network-diagnostics/network-check-target-xtb9l" Apr 23 17:42:38.763389 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.763312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gwf2x\"" Apr 23 17:42:38.770859 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.770837 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xtb9l"
Apr 23 17:42:38.895173 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:38.895139 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xtb9l"]
Apr 23 17:42:38.899093 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:42:38.899067 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda73a83b1_557f_48ee_895c_c53fd945675b.slice/crio-f81214b9985e82a268de7d9d0bf53c039d2947fad660c629726aa99c60cf9e83 WatchSource:0}: Error finding container f81214b9985e82a268de7d9d0bf53c039d2947fad660c629726aa99c60cf9e83: Status 404 returned error can't find the container with id f81214b9985e82a268de7d9d0bf53c039d2947fad660c629726aa99c60cf9e83
Apr 23 17:42:39.880055 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:39.880019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xtb9l" event={"ID":"a73a83b1-557f-48ee-895c-c53fd945675b","Type":"ContainerStarted","Data":"f81214b9985e82a268de7d9d0bf53c039d2947fad660c629726aa99c60cf9e83"}
Apr 23 17:42:41.884694 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:41.884660 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xtb9l" event={"ID":"a73a83b1-557f-48ee-895c-c53fd945675b","Type":"ContainerStarted","Data":"b18700a27f931d5e18508e2244559e77696466a39b4b971af1348a2646227482"}
Apr 23 17:42:41.885135 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:41.884788 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xtb9l"
Apr 23 17:42:41.902056 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:42:41.902013 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xtb9l" podStartSLOduration=67.379346835 podStartE2EDuration="1m9.902001315s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:42:38.900833187 +0000 UTC m=+66.830845800" lastFinishedPulling="2026-04-23 17:42:41.423487652 +0000 UTC m=+69.353500280" observedRunningTime="2026-04-23 17:42:41.90154411 +0000 UTC m=+69.831556746" watchObservedRunningTime="2026-04-23 17:42:41.902001315 +0000 UTC m=+69.832013942"
Apr 23 17:43:09.300550 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:09.300424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d"
Apr 23 17:43:09.300550 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:09.300526 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:43:09.301097 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:09.300594 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls podName:9fad7629-2a8f-44c8-8668-437d00f77bca nodeName:}" failed. No retries permitted until 2026-04-23 17:44:13.30057843 +0000 UTC m=+161.230591044 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls") pod "dns-default-fxb9d" (UID: "9fad7629-2a8f-44c8-8668-437d00f77bca") : secret "dns-default-metrics-tls" not found
Apr 23 17:43:09.401371 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:09.401325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq"
Apr 23 17:43:09.401555 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:09.401452 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:43:09.401555 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:09.401528 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert podName:1714203d-7df7-4a8f-8d58-69bc1d7062f4 nodeName:}" failed. No retries permitted until 2026-04-23 17:44:13.401510507 +0000 UTC m=+161.331523121 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert") pod "ingress-canary-z8pdq" (UID: "1714203d-7df7-4a8f-8d58-69bc1d7062f4") : secret "canary-serving-cert" not found
Apr 23 17:43:12.889532 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:12.889495 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xtb9l"
Apr 23 17:43:27.464049 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.464012 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2kw6v"]
Apr 23 17:43:27.487249 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.487217 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2kw6v"]
Apr 23 17:43:27.487407 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.487364 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.490884 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.490859 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 17:43:27.625590 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.625546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0279be05-6d6b-46d3-9a5d-97af3972be80-kubelet-config\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.625590 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.625587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0279be05-6d6b-46d3-9a5d-97af3972be80-original-pull-secret\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.625944 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.625676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0279be05-6d6b-46d3-9a5d-97af3972be80-dbus\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.726248 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.726170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0279be05-6d6b-46d3-9a5d-97af3972be80-dbus\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.726248 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.726223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0279be05-6d6b-46d3-9a5d-97af3972be80-kubelet-config\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.726398 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.726290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0279be05-6d6b-46d3-9a5d-97af3972be80-kubelet-config\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.726398 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.726345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0279be05-6d6b-46d3-9a5d-97af3972be80-original-pull-secret\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.726398 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.726376 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0279be05-6d6b-46d3-9a5d-97af3972be80-dbus\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.728638 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.728619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0279be05-6d6b-46d3-9a5d-97af3972be80-original-pull-secret\") pod \"global-pull-secret-syncer-2kw6v\" (UID: \"0279be05-6d6b-46d3-9a5d-97af3972be80\") " pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.796932 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.796893 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2kw6v"
Apr 23 17:43:27.920615 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.920574 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2kw6v"]
Apr 23 17:43:27.925825 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:43:27.925794 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0279be05_6d6b_46d3_9a5d_97af3972be80.slice/crio-af7cc2bb907b4eab6e2f4377422318fefc2dc7100114fbc9c8893fa2878a9ec6 WatchSource:0}: Error finding container af7cc2bb907b4eab6e2f4377422318fefc2dc7100114fbc9c8893fa2878a9ec6: Status 404 returned error can't find the container with id af7cc2bb907b4eab6e2f4377422318fefc2dc7100114fbc9c8893fa2878a9ec6
Apr 23 17:43:27.966823 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:27.966782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2kw6v" event={"ID":"0279be05-6d6b-46d3-9a5d-97af3972be80","Type":"ContainerStarted","Data":"af7cc2bb907b4eab6e2f4377422318fefc2dc7100114fbc9c8893fa2878a9ec6"}
Apr 23 17:43:31.975834 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:31.975790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2kw6v" event={"ID":"0279be05-6d6b-46d3-9a5d-97af3972be80","Type":"ContainerStarted","Data":"3dccb26ea8021cff2244caad7d0d53d36747822f4be44da2fe6970c0f314c496"}
Apr 23 17:43:31.996884 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:31.996829 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2kw6v" podStartSLOduration=1.535539856 podStartE2EDuration="4.996788865s" podCreationTimestamp="2026-04-23 17:43:27 +0000 UTC" firstStartedPulling="2026-04-23 17:43:27.927888977 +0000 UTC m=+115.857901594" lastFinishedPulling="2026-04-23 17:43:31.389137989 +0000 UTC m=+119.319150603" observedRunningTime="2026-04-23 17:43:31.995961981 +0000 UTC m=+119.925974617" watchObservedRunningTime="2026-04-23 17:43:31.996788865 +0000 UTC m=+119.926801561"
Apr 23 17:43:40.589814 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.589776 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5nzks"]
Apr 23 17:43:40.592836 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.592810 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"]
Apr 23 17:43:40.592980 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.592959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.595560 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.595540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:40.595967 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.595949 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 17:43:40.595967 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.595959 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 17:43:40.596749 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.596713 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:43:40.596935 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.596912 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bzdjd\""
Apr 23 17:43:40.600687 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.600667 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 17:43:40.600814 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.600707 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 17:43:40.600814 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.600771 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:43:40.600895 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.600835 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8xk82\""
Apr 23 17:43:40.602507 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.602493 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 17:43:40.611038 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.611016 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 17:43:40.615383 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.615359 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5nzks"]
Apr 23 17:43:40.618336 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.618313 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"]
Apr 23 17:43:40.727762 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.727704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-trusted-ca\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.727954 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.727767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slxgz\" (UniqueName: \"kubernetes.io/projected/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-kube-api-access-slxgz\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.727954 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.727790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:40.727954 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.727826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8wpd\" (UniqueName: \"kubernetes.io/projected/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-kube-api-access-f8wpd\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:40.727954 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.727906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-serving-cert\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.727954 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.727950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-config\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.828801 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.828761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-config\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.828989 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.828857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-trusted-ca\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.828989 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.828886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slxgz\" (UniqueName: \"kubernetes.io/projected/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-kube-api-access-slxgz\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.828989 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.828908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:40.828989 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.828938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8wpd\" (UniqueName: \"kubernetes.io/projected/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-kube-api-access-f8wpd\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:40.828989 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.828977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-serving-cert\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.829235 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:40.829032 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 17:43:40.829235 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:40.829107 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls podName:a57dc3a2-8d1a-4796-b085-1a4dcb8cd875 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:41.329090302 +0000 UTC m=+129.259102919 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vdnk8" (UID: "a57dc3a2-8d1a-4796-b085-1a4dcb8cd875") : secret "samples-operator-tls" not found
Apr 23 17:43:40.829661 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.829637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-config\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.829741 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.829637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-trusted-ca\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.831307 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.831287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-serving-cert\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.842831 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.842776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8wpd\" (UniqueName: \"kubernetes.io/projected/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-kube-api-access-f8wpd\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:40.843207 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.843184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slxgz\" (UniqueName: \"kubernetes.io/projected/79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8-kube-api-access-slxgz\") pod \"console-operator-9d4b6777b-5nzks\" (UID: \"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8\") " pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:40.902553 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:40.902525 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:41.019778 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.019746 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5nzks"]
Apr 23 17:43:41.022612 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:43:41.022581 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f1dda8_3f2e_4cdb_99aa_0b76c1a17dd8.slice/crio-55a4ea04ff9464edf9940c6d9d7cdb610ce2c95e7d5c49904783c30aa70aba45 WatchSource:0}: Error finding container 55a4ea04ff9464edf9940c6d9d7cdb610ce2c95e7d5c49904783c30aa70aba45: Status 404 returned error can't find the container with id 55a4ea04ff9464edf9940c6d9d7cdb610ce2c95e7d5c49904783c30aa70aba45
Apr 23 17:43:41.332335 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.332302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:41.332546 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:41.332480 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 17:43:41.332615 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:41.332564 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls podName:a57dc3a2-8d1a-4796-b085-1a4dcb8cd875 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:42.332541115 +0000 UTC m=+130.262553743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vdnk8" (UID: "a57dc3a2-8d1a-4796-b085-1a4dcb8cd875") : secret "samples-operator-tls" not found
Apr 23 17:43:41.472304 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.472237 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-684c867d65-lhpfq"]
Apr 23 17:43:41.475313 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.475293 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.486941 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.486887 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 17:43:41.486941 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.486901 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 17:43:41.487118 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.486989 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9m6lq\""
Apr 23 17:43:41.487274 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.487244 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 17:43:41.493279 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.493258 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 17:43:41.503775 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.503728 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-684c867d65-lhpfq"]
Apr 23 17:43:41.636206 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.636109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.636206 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.636157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-trusted-ca\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.636206 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.636186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-image-registry-private-configuration\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.636691 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.636211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-bound-sa-token\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.636691 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.636283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-certificates\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.636691 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.636321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2fj\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-kube-api-access-rf2fj\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.636691 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.636369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-ca-trust-extracted\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.636691 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.636409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-installation-pull-secrets\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.737708 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.737671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.737708 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.737709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-trusted-ca\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.737944 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.737743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-image-registry-private-configuration\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.737944 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.737762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-bound-sa-token\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.737944 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.737823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-certificates\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.737944 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:41.737848 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:43:41.737944 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:41.737885 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684c867d65-lhpfq: secret "image-registry-tls" not found
Apr 23 17:43:41.738115 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:41.737949 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls podName:0c7736d1-09e9-4b6a-a2e0-a5675df75c37 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:42.23792428 +0000 UTC m=+130.167936909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls") pod "image-registry-684c867d65-lhpfq" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37") : secret "image-registry-tls" not found
Apr 23 17:43:41.738115 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.737855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2fj\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-kube-api-access-rf2fj\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.738115 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.738089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-ca-trust-extracted\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.738254 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.738129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-installation-pull-secrets\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.738532 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.738486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-ca-trust-extracted\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.738657 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.738549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-certificates\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.738916 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.738891 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-trusted-ca\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.740949 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.740922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-installation-pull-secrets\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.741040 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.740966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-image-registry-private-configuration\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.749735 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.749692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2fj\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-kube-api-access-rf2fj\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.749998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.749970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-bound-sa-token\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:41.995391 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:41.995320 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" event={"ID":"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8","Type":"ContainerStarted","Data":"55a4ea04ff9464edf9940c6d9d7cdb610ce2c95e7d5c49904783c30aa70aba45"}
Apr 23 17:43:42.242642 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:42.242611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:42.242836 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:42.242784 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:43:42.242836 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:42.242803 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684c867d65-lhpfq: secret
"image-registry-tls" not found Apr 23 17:43:42.242953 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:42.242869 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls podName:0c7736d1-09e9-4b6a-a2e0-a5675df75c37 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:43.242847937 +0000 UTC m=+131.172860552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls") pod "image-registry-684c867d65-lhpfq" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37") : secret "image-registry-tls" not found Apr 23 17:43:42.343567 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:42.343531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8" Apr 23 17:43:42.343764 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:42.343693 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 17:43:42.343824 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:42.343781 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls podName:a57dc3a2-8d1a-4796-b085-1a4dcb8cd875 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:44.343764789 +0000 UTC m=+132.273777411 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vdnk8" (UID: "a57dc3a2-8d1a-4796-b085-1a4dcb8cd875") : secret "samples-operator-tls" not found Apr 23 17:43:42.444795 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:42.444749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx" Apr 23 17:43:42.444978 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:42.444904 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:43:42.445039 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:42.444984 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs podName:1b961de5-fea1-4bac-9c17-d8682d9a4242 nodeName:}" failed. No retries permitted until 2026-04-23 17:45:44.444963272 +0000 UTC m=+252.374975903 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs") pod "network-metrics-daemon-vjjmx" (UID: "1b961de5-fea1-4bac-9c17-d8682d9a4242") : secret "metrics-daemon-secret" not found Apr 23 17:43:42.998703 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:42.998632 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/0.log" Apr 23 17:43:42.998703 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:42.998672 2576 generic.go:358] "Generic (PLEG): container finished" podID="79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8" containerID="d49fea74e5c71c787f1abfcad5b96c54dbc6c37f6a68940a48333e9e917294e5" exitCode=255 Apr 23 17:43:42.999136 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:42.998704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" event={"ID":"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8","Type":"ContainerDied","Data":"d49fea74e5c71c787f1abfcad5b96c54dbc6c37f6a68940a48333e9e917294e5"} Apr 23 17:43:42.999136 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:42.998990 2576 scope.go:117] "RemoveContainer" containerID="d49fea74e5c71c787f1abfcad5b96c54dbc6c37f6a68940a48333e9e917294e5" Apr 23 17:43:43.253883 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.253789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq" Apr 23 17:43:43.254018 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:43.253934 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 
17:43:43.254018 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:43.253952 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684c867d65-lhpfq: secret "image-registry-tls" not found Apr 23 17:43:43.254018 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:43.254013 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls podName:0c7736d1-09e9-4b6a-a2e0-a5675df75c37 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:45.253997802 +0000 UTC m=+133.184010416 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls") pod "image-registry-684c867d65-lhpfq" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37") : secret "image-registry-tls" not found Apr 23 17:43:43.371596 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.371566 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l"] Apr 23 17:43:43.374387 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.374365 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.377376 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.377353 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 23 17:43:43.377518 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.377405 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:43:43.377518 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.377473 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-8pwvn\"" Apr 23 17:43:43.377631 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.377559 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 23 17:43:43.377874 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.377857 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 23 17:43:43.385027 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.385006 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l"] Apr 23 17:43:43.455313 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.455277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1174d10-c6be-499b-bba1-9efb0ba75fac-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: \"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.455313 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.455313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jxq9\" (UniqueName: \"kubernetes.io/projected/c1174d10-c6be-499b-bba1-9efb0ba75fac-kube-api-access-2jxq9\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: \"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.455484 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.455353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1174d10-c6be-499b-bba1-9efb0ba75fac-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: \"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.555752 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.555675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1174d10-c6be-499b-bba1-9efb0ba75fac-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: \"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.555752 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.555710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jxq9\" (UniqueName: \"kubernetes.io/projected/c1174d10-c6be-499b-bba1-9efb0ba75fac-kube-api-access-2jxq9\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: 
\"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.555858 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.555761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1174d10-c6be-499b-bba1-9efb0ba75fac-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: \"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.556212 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.556192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1174d10-c6be-499b-bba1-9efb0ba75fac-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: \"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.558033 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.558012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1174d10-c6be-499b-bba1-9efb0ba75fac-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: \"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.564378 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.564357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jxq9\" (UniqueName: \"kubernetes.io/projected/c1174d10-c6be-499b-bba1-9efb0ba75fac-kube-api-access-2jxq9\") pod \"kube-storage-version-migrator-operator-6769c5d45-6nd9l\" (UID: \"c1174d10-c6be-499b-bba1-9efb0ba75fac\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.682573 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.682528 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" Apr 23 17:43:43.794630 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:43.794601 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l"] Apr 23 17:43:43.797378 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:43:43.797351 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1174d10_c6be_499b_bba1_9efb0ba75fac.slice/crio-23be57d8e610c14d50ae8bcdc1f8f95cf6b140edbcef0574ded4a3a6054e82a4 WatchSource:0}: Error finding container 23be57d8e610c14d50ae8bcdc1f8f95cf6b140edbcef0574ded4a3a6054e82a4: Status 404 returned error can't find the container with id 23be57d8e610c14d50ae8bcdc1f8f95cf6b140edbcef0574ded4a3a6054e82a4 Apr 23 17:43:44.001917 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:44.001881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" event={"ID":"c1174d10-c6be-499b-bba1-9efb0ba75fac","Type":"ContainerStarted","Data":"23be57d8e610c14d50ae8bcdc1f8f95cf6b140edbcef0574ded4a3a6054e82a4"} Apr 23 17:43:44.003222 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:44.003199 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 17:43:44.003589 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:44.003575 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/0.log" Apr 23 17:43:44.003651 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:44.003607 2576 generic.go:358] "Generic (PLEG): container finished" podID="79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8" containerID="d8af809606705a0d0bf47fdb6bcaaa8cd0d0b58eaff4bc98e4d6e17e58aa3c33" exitCode=255 Apr 23 17:43:44.003651 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:44.003634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" event={"ID":"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8","Type":"ContainerDied","Data":"d8af809606705a0d0bf47fdb6bcaaa8cd0d0b58eaff4bc98e4d6e17e58aa3c33"} Apr 23 17:43:44.003778 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:44.003679 2576 scope.go:117] "RemoveContainer" containerID="d49fea74e5c71c787f1abfcad5b96c54dbc6c37f6a68940a48333e9e917294e5" Apr 23 17:43:44.003937 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:44.003919 2576 scope.go:117] "RemoveContainer" containerID="d8af809606705a0d0bf47fdb6bcaaa8cd0d0b58eaff4bc98e4d6e17e58aa3c33" Apr 23 17:43:44.004125 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:44.004108 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5nzks_openshift-console-operator(79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" podUID="79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8" Apr 23 17:43:44.362289 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:44.362256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: 
\"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8" Apr 23 17:43:44.362456 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:44.362398 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 17:43:44.362456 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:44.362455 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls podName:a57dc3a2-8d1a-4796-b085-1a4dcb8cd875 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:48.362440832 +0000 UTC m=+136.292453451 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vdnk8" (UID: "a57dc3a2-8d1a-4796-b085-1a4dcb8cd875") : secret "samples-operator-tls" not found Apr 23 17:43:45.006641 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:45.006600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 17:43:45.007149 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:45.007129 2576 scope.go:117] "RemoveContainer" containerID="d8af809606705a0d0bf47fdb6bcaaa8cd0d0b58eaff4bc98e4d6e17e58aa3c33" Apr 23 17:43:45.007373 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:45.007350 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5nzks_openshift-console-operator(79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" podUID="79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8" 
Apr 23 17:43:45.269927 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:45.269846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq" Apr 23 17:43:45.270078 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:45.270020 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:43:45.270078 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:45.270039 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684c867d65-lhpfq: secret "image-registry-tls" not found Apr 23 17:43:45.270178 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:45.270099 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls podName:0c7736d1-09e9-4b6a-a2e0-a5675df75c37 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:49.270080042 +0000 UTC m=+137.200092674 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls") pod "image-registry-684c867d65-lhpfq" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37") : secret "image-registry-tls" not found Apr 23 17:43:46.010213 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:46.010177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" event={"ID":"c1174d10-c6be-499b-bba1-9efb0ba75fac","Type":"ContainerStarted","Data":"492d8a6f3d2f45e7d32efb3dd81063d01f30b435d26e6ec775ea275215803ec4"} Apr 23 17:43:46.033070 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:46.033022 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" podStartSLOduration=1.292633886 podStartE2EDuration="3.033006998s" podCreationTimestamp="2026-04-23 17:43:43 +0000 UTC" firstStartedPulling="2026-04-23 17:43:43.799247215 +0000 UTC m=+131.729259829" lastFinishedPulling="2026-04-23 17:43:45.539620328 +0000 UTC m=+133.469632941" observedRunningTime="2026-04-23 17:43:46.031508396 +0000 UTC m=+133.961521032" watchObservedRunningTime="2026-04-23 17:43:46.033006998 +0000 UTC m=+133.963019634" Apr 23 17:43:46.725632 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:46.725606 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v6skv_2928f4e6-28bf-471f-bb81-513b3e161d32/dns-node-resolver/0.log" Apr 23 17:43:47.325017 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:47.324986 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4f55z_19ad9566-830f-4ba3-bed2-db16fce5cd6a/node-ca/0.log" Apr 23 17:43:48.393826 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:48.393793 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8" Apr 23 17:43:48.394223 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:48.393919 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 17:43:48.394223 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:48.393972 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls podName:a57dc3a2-8d1a-4796-b085-1a4dcb8cd875 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:56.393958581 +0000 UTC m=+144.323971195 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vdnk8" (UID: "a57dc3a2-8d1a-4796-b085-1a4dcb8cd875") : secret "samples-operator-tls" not found Apr 23 17:43:49.301435 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:49.301384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq" Apr 23 17:43:49.301634 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:49.301532 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:43:49.301634 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:49.301554 2576 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-684c867d65-lhpfq: secret "image-registry-tls" not found Apr 23 17:43:49.301634 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:49.301606 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls podName:0c7736d1-09e9-4b6a-a2e0-a5675df75c37 nodeName:}" failed. No retries permitted until 2026-04-23 17:43:57.301590171 +0000 UTC m=+145.231602785 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls") pod "image-registry-684c867d65-lhpfq" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37") : secret "image-registry-tls" not found Apr 23 17:43:49.525347 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:49.525320 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v7r4_f1d4eda8-73aa-4336-8bc4-dd2b15195cac/csi-driver/0.log" Apr 23 17:43:49.724398 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:49.724372 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v7r4_f1d4eda8-73aa-4336-8bc4-dd2b15195cac/csi-node-driver-registrar/0.log" Apr 23 17:43:49.925052 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:49.925015 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v7r4_f1d4eda8-73aa-4336-8bc4-dd2b15195cac/csi-liveness-probe/0.log" Apr 23 17:43:50.902942 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:50.902912 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" Apr 23 17:43:50.902942 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:50.902947 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:43:50.903370 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:50.903308 2576 scope.go:117] "RemoveContainer" containerID="d8af809606705a0d0bf47fdb6bcaaa8cd0d0b58eaff4bc98e4d6e17e58aa3c33"
Apr 23 17:43:50.903490 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:50.903470 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5nzks_openshift-console-operator(79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" podUID="79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8"
Apr 23 17:43:56.456024 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:56.455987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:56.458386 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:56.458363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57dc3a2-8d1a-4796-b085-1a4dcb8cd875-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vdnk8\" (UID: \"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:56.510276 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:56.510238 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"
Apr 23 17:43:56.628779 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:56.628744 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8"]
Apr 23 17:43:57.031367 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:57.031324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8" event={"ID":"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875","Type":"ContainerStarted","Data":"dbb771971aa9325a29a34c881004e4759b0f311c742fb628d128594f64633949"}
Apr 23 17:43:57.363262 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:57.363217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:43:57.363443 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:57.363370 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:43:57.363443 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:57.363394 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684c867d65-lhpfq: secret "image-registry-tls" not found
Apr 23 17:43:57.363534 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:43:57.363456 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls podName:0c7736d1-09e9-4b6a-a2e0-a5675df75c37 nodeName:}" failed. No retries permitted until 2026-04-23 17:44:13.363440238 +0000 UTC m=+161.293452853 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls") pod "image-registry-684c867d65-lhpfq" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37") : secret "image-registry-tls" not found
Apr 23 17:43:59.037659 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:59.037613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8" event={"ID":"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875","Type":"ContainerStarted","Data":"c0c9ac1839d37382287614b90e90cae461f26eb76c7408620d47c8acda31cfba"}
Apr 23 17:43:59.037659 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:59.037652 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8" event={"ID":"a57dc3a2-8d1a-4796-b085-1a4dcb8cd875","Type":"ContainerStarted","Data":"49ecd70c360a86aea5a51145cff89308cdaaeeaead37d060dcbce42ab4ab0eba"}
Apr 23 17:43:59.058282 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:43:59.058239 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vdnk8" podStartSLOduration=17.670403475 podStartE2EDuration="19.05822502s" podCreationTimestamp="2026-04-23 17:43:40 +0000 UTC" firstStartedPulling="2026-04-23 17:43:56.668938318 +0000 UTC m=+144.598950932" lastFinishedPulling="2026-04-23 17:43:58.056759849 +0000 UTC m=+145.986772477" observedRunningTime="2026-04-23 17:43:59.057183354 +0000 UTC m=+146.987195992" watchObservedRunningTime="2026-04-23 17:43:59.05822502 +0000 UTC m=+146.988237655"
Apr 23 17:44:05.643979 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:05.643945 2576 scope.go:117] "RemoveContainer" containerID="d8af809606705a0d0bf47fdb6bcaaa8cd0d0b58eaff4bc98e4d6e17e58aa3c33"
Apr 23 17:44:06.055257 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:06.055178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log"
Apr 23 17:44:06.055257 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:06.055253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" event={"ID":"79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8","Type":"ContainerStarted","Data":"b57b102124578a7ad410be9b99fdf0cf36f7ff5897a9ac0639a65425ce541ea2"}
Apr 23 17:44:06.055533 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:06.055511 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:44:06.078815 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:06.078756 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks" podStartSLOduration=24.56860443 podStartE2EDuration="26.078742252s" podCreationTimestamp="2026-04-23 17:43:40 +0000 UTC" firstStartedPulling="2026-04-23 17:43:41.024290492 +0000 UTC m=+128.954303105" lastFinishedPulling="2026-04-23 17:43:42.534428309 +0000 UTC m=+130.464440927" observedRunningTime="2026-04-23 17:44:06.0770067 +0000 UTC m=+154.007019336" watchObservedRunningTime="2026-04-23 17:44:06.078742252 +0000 UTC m=+154.008754887"
Apr 23 17:44:06.503055 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:06.503027 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-5nzks"
Apr 23 17:44:08.502663 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:44:08.502618 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fxb9d" podUID="9fad7629-2a8f-44c8-8668-437d00f77bca"
Apr 23 17:44:08.508759 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:44:08.508709 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-z8pdq" podUID="1714203d-7df7-4a8f-8d58-69bc1d7062f4"
Apr 23 17:44:09.064057 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:09.064028 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z8pdq"
Apr 23 17:44:09.064232 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:09.064030 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fxb9d"
Apr 23 17:44:09.656600 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:44:09.656554 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-vjjmx" podUID="1b961de5-fea1-4bac-9c17-d8682d9a4242"
Apr 23 17:44:13.376214 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.376172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d"
Apr 23 17:44:13.376214 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.376220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:44:13.378626 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.378597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fad7629-2a8f-44c8-8668-437d00f77bca-metrics-tls\") pod \"dns-default-fxb9d\" (UID: \"9fad7629-2a8f-44c8-8668-437d00f77bca\") " pod="openshift-dns/dns-default-fxb9d"
Apr 23 17:44:13.378748 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.378658 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"image-registry-684c867d65-lhpfq\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:44:13.477080 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.477033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq"
Apr 23 17:44:13.479349 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.479330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1714203d-7df7-4a8f-8d58-69bc1d7062f4-cert\") pod \"ingress-canary-z8pdq\" (UID: \"1714203d-7df7-4a8f-8d58-69bc1d7062f4\") " pod="openshift-ingress-canary/ingress-canary-z8pdq"
Apr 23 17:44:13.568104 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.568067 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-576jd\""
Apr 23 17:44:13.569051 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.569033 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zw6qq\""
Apr 23 17:44:13.575275 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.575253 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fxb9d"
Apr 23 17:44:13.575370 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.575340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z8pdq"
Apr 23 17:44:13.586214 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.586187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:44:13.713484 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.713454 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z8pdq"]
Apr 23 17:44:13.717590 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:44:13.717552 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1714203d_7df7_4a8f_8d58_69bc1d7062f4.slice/crio-1d9a0d75c75ef077d26212753bb933df86cb9bbc47d9c4a8022e0fb059b788f5 WatchSource:0}: Error finding container 1d9a0d75c75ef077d26212753bb933df86cb9bbc47d9c4a8022e0fb059b788f5: Status 404 returned error can't find the container with id 1d9a0d75c75ef077d26212753bb933df86cb9bbc47d9c4a8022e0fb059b788f5
Apr 23 17:44:13.942633 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.942556 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fxb9d"]
Apr 23 17:44:13.945589 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:44:13.945549 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fad7629_2a8f_44c8_8668_437d00f77bca.slice/crio-22e8071f7bbde5668c2fcb1234c1b84617114f13747c057a8b9e50435545b27c WatchSource:0}: Error finding container 22e8071f7bbde5668c2fcb1234c1b84617114f13747c057a8b9e50435545b27c: Status 404 returned error can't find the container with id 22e8071f7bbde5668c2fcb1234c1b84617114f13747c057a8b9e50435545b27c
Apr 23 17:44:13.945764 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:13.945731 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-684c867d65-lhpfq"]
Apr 23 17:44:13.949124 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:44:13.949096 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c7736d1_09e9_4b6a_a2e0_a5675df75c37.slice/crio-078718bed934975d0bf936722f642afdaf84b91a5101fa311c084348deb7a8c0 WatchSource:0}: Error finding container 078718bed934975d0bf936722f642afdaf84b91a5101fa311c084348deb7a8c0: Status 404 returned error can't find the container with id 078718bed934975d0bf936722f642afdaf84b91a5101fa311c084348deb7a8c0
Apr 23 17:44:14.077797 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:14.077758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" event={"ID":"0c7736d1-09e9-4b6a-a2e0-a5675df75c37","Type":"ContainerStarted","Data":"6c311c7ec09a2d97f1174e023ceaae912e3a9a10299d415173474837f42d3624"}
Apr 23 17:44:14.077997 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:14.077806 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" event={"ID":"0c7736d1-09e9-4b6a-a2e0-a5675df75c37","Type":"ContainerStarted","Data":"078718bed934975d0bf936722f642afdaf84b91a5101fa311c084348deb7a8c0"}
Apr 23 17:44:14.077997 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:14.077840 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:44:14.078902 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:14.078878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z8pdq" event={"ID":"1714203d-7df7-4a8f-8d58-69bc1d7062f4","Type":"ContainerStarted","Data":"1d9a0d75c75ef077d26212753bb933df86cb9bbc47d9c4a8022e0fb059b788f5"}
Apr 23 17:44:14.079902 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:14.079882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxb9d" event={"ID":"9fad7629-2a8f-44c8-8668-437d00f77bca","Type":"ContainerStarted","Data":"22e8071f7bbde5668c2fcb1234c1b84617114f13747c057a8b9e50435545b27c"}
Apr 23 17:44:14.099349 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:14.099290 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podStartSLOduration=33.099270508 podStartE2EDuration="33.099270508s" podCreationTimestamp="2026-04-23 17:43:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:44:14.098768926 +0000 UTC m=+162.028781563" watchObservedRunningTime="2026-04-23 17:44:14.099270508 +0000 UTC m=+162.029283145"
Apr 23 17:44:16.086655 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:16.086622 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z8pdq" event={"ID":"1714203d-7df7-4a8f-8d58-69bc1d7062f4","Type":"ContainerStarted","Data":"73bfe2a25016e7f9c202adf70be25b42792f79f783f4cf6c265855e8a57bb6ea"}
Apr 23 17:44:16.088160 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:16.088136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxb9d" event={"ID":"9fad7629-2a8f-44c8-8668-437d00f77bca","Type":"ContainerStarted","Data":"0efca9bbc5ccd5581e8739b3477b13fa93569d7b5014b495683223956972165f"}
Apr 23 17:44:16.088256 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:16.088169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxb9d" event={"ID":"9fad7629-2a8f-44c8-8668-437d00f77bca","Type":"ContainerStarted","Data":"6404fde11316ef8d9d146d037147f79e9b685b9b47ef2adaf66c169150ce57df"}
Apr 23 17:44:16.088256 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:16.088246 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fxb9d"
Apr 23 17:44:16.102848 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:16.102800 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z8pdq" podStartSLOduration=129.155829424 podStartE2EDuration="2m11.102784629s" podCreationTimestamp="2026-04-23 17:42:05 +0000 UTC" firstStartedPulling="2026-04-23 17:44:13.719386736 +0000 UTC m=+161.649399349" lastFinishedPulling="2026-04-23 17:44:15.666341939 +0000 UTC m=+163.596354554" observedRunningTime="2026-04-23 17:44:16.102686761 +0000 UTC m=+164.032699398" watchObservedRunningTime="2026-04-23 17:44:16.102784629 +0000 UTC m=+164.032797256"
Apr 23 17:44:16.123117 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:16.123067 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fxb9d" podStartSLOduration=129.403302982 podStartE2EDuration="2m11.123052974s" podCreationTimestamp="2026-04-23 17:42:05 +0000 UTC" firstStartedPulling="2026-04-23 17:44:13.947697396 +0000 UTC m=+161.877710010" lastFinishedPulling="2026-04-23 17:44:15.667447387 +0000 UTC m=+163.597460002" observedRunningTime="2026-04-23 17:44:16.122443907 +0000 UTC m=+164.052456544" watchObservedRunningTime="2026-04-23 17:44:16.123052974 +0000 UTC m=+164.053065610"
Apr 23 17:44:20.643777 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:20.643691 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx"
Apr 23 17:44:26.093317 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:26.093287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fxb9d"
Apr 23 17:44:33.590566 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:33.590529 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:33.590957 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:33.590594 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:35.086334 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:35.086298 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:35.086783 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:35.086365 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:43.590612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:43.590574 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:43.591201 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:43.590647 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:45.086358 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:45.086327 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:45.086739 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:45.086385 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:53.590342 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:53.590309 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:53.590760 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:53.590370 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:44:53.590760 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:53.590408 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:44:53.590967 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:53.590931 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"6c311c7ec09a2d97f1174e023ceaae912e3a9a10299d415173474837f42d3624"} pod="openshift-image-registry/image-registry-684c867d65-lhpfq" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 23 17:44:53.594270 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:53.594243 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:44:53.594359 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:44:53.594295 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:02.219987 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:02.219948 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log"
Apr 23 17:45:02.420074 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:02.420038 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/2.log"
Apr 23 17:45:03.018255 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:03.018228 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fxb9d_9fad7629-2a8f-44c8-8668-437d00f77bca/dns/0.log"
Apr 23 17:45:03.217893 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:03.217864 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fxb9d_9fad7629-2a8f-44c8-8668-437d00f77bca/kube-rbac-proxy/0.log"
Apr 23 17:45:03.594543 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:03.594513 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:03.594929 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:03.594570 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:04.220020 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:04.219991 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v6skv_2928f4e6-28bf-471f-bb81-513b3e161d32/dns-node-resolver/0.log"
Apr 23 17:45:04.419186 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:04.419158 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-684c867d65-lhpfq_0c7736d1-09e9-4b6a-a2e0-a5675df75c37/registry/0.log"
Apr 23 17:45:04.817221 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:04.817186 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4f55z_19ad9566-830f-4ba3-bed2-db16fce5cd6a/node-ca/0.log"
Apr 23 17:45:06.018376 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:06.018337 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z8pdq_1714203d-7df7-4a8f-8d58-69bc1d7062f4/serve-healthcheck-canary/0.log"
Apr 23 17:45:07.017850 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:07.017824 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v7r4_f1d4eda8-73aa-4336-8bc4-dd2b15195cac/csi-driver/0.log"
Apr 23 17:45:07.218036 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:07.218005 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v7r4_f1d4eda8-73aa-4336-8bc4-dd2b15195cac/csi-node-driver-registrar/0.log"
Apr 23 17:45:07.417989 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:07.417960 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v7r4_f1d4eda8-73aa-4336-8bc4-dd2b15195cac/csi-liveness-probe/0.log"
Apr 23 17:45:11.226880 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:11.226849 2576 generic.go:358] "Generic (PLEG): container finished" podID="c1174d10-c6be-499b-bba1-9efb0ba75fac" containerID="492d8a6f3d2f45e7d32efb3dd81063d01f30b435d26e6ec775ea275215803ec4" exitCode=0
Apr 23 17:45:11.227331 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:11.226892 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" event={"ID":"c1174d10-c6be-499b-bba1-9efb0ba75fac","Type":"ContainerDied","Data":"492d8a6f3d2f45e7d32efb3dd81063d01f30b435d26e6ec775ea275215803ec4"}
Apr 23 17:45:11.227331 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:11.227248 2576 scope.go:117] "RemoveContainer" containerID="492d8a6f3d2f45e7d32efb3dd81063d01f30b435d26e6ec775ea275215803ec4"
Apr 23 17:45:12.231084 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:12.231053 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6nd9l" event={"ID":"c1174d10-c6be-499b-bba1-9efb0ba75fac","Type":"ContainerStarted","Data":"76e66bb4a142fc977e5d66754be2e9083600f98b60e98dbcb92417d7d2b705b2"}
Apr 23 17:45:13.594381 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:13.594347 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:13.594781 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:13.594396 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:18.608697 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:18.608653 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" containerID="cri-o://6c311c7ec09a2d97f1174e023ceaae912e3a9a10299d415173474837f42d3624" gracePeriod=30
Apr 23 17:45:19.248594 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:19.248560 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerID="6c311c7ec09a2d97f1174e023ceaae912e3a9a10299d415173474837f42d3624" exitCode=0
Apr 23 17:45:19.248785 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:19.248603 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" event={"ID":"0c7736d1-09e9-4b6a-a2e0-a5675df75c37","Type":"ContainerDied","Data":"6c311c7ec09a2d97f1174e023ceaae912e3a9a10299d415173474837f42d3624"}
Apr 23 17:45:19.248785 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:19.248628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" event={"ID":"0c7736d1-09e9-4b6a-a2e0-a5675df75c37","Type":"ContainerStarted","Data":"461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b"}
Apr 23 17:45:19.248878 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:19.248800 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:45:33.590256 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:33.590218 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:33.590614 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:33.590275 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:40.255648 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:40.255615 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:40.256033 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:40.255663 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:43.590684 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:43.590648 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:43.591166 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:43.590742 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:44.528461 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:44.528410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx"
Apr 23 17:45:44.530664 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:44.530646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b961de5-fea1-4bac-9c17-d8682d9a4242-metrics-certs\") pod \"network-metrics-daemon-vjjmx\" (UID: \"1b961de5-fea1-4bac-9c17-d8682d9a4242\") " pod="openshift-multus/network-metrics-daemon-vjjmx"
Apr 23 17:45:44.647934 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:44.647901 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdbsc\""
Apr 23 17:45:44.655620 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:44.655603 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjjmx"
Apr 23 17:45:44.769366 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:44.769335 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vjjmx"]
Apr 23 17:45:44.772750 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:45:44.772708 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b961de5_fea1_4bac_9c17_d8682d9a4242.slice/crio-cb0efaba197e3ea1d9bfc9fbef85205ac18ca6d37a820a21d9245b30b5cc5d97 WatchSource:0}: Error finding container cb0efaba197e3ea1d9bfc9fbef85205ac18ca6d37a820a21d9245b30b5cc5d97: Status 404 returned error can't find the container with id cb0efaba197e3ea1d9bfc9fbef85205ac18ca6d37a820a21d9245b30b5cc5d97
Apr 23 17:45:45.313735 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:45.313685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjjmx" event={"ID":"1b961de5-fea1-4bac-9c17-d8682d9a4242","Type":"ContainerStarted","Data":"cb0efaba197e3ea1d9bfc9fbef85205ac18ca6d37a820a21d9245b30b5cc5d97"}
Apr 23 17:45:46.318148 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:46.318111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjjmx" event={"ID":"1b961de5-fea1-4bac-9c17-d8682d9a4242","Type":"ContainerStarted","Data":"e87dc205437b2f8961bbd066378f3bea37c41fea338b50419b89c8dc96e79dfb"}
Apr 23 17:45:46.318148 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:46.318154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjjmx" event={"ID":"1b961de5-fea1-4bac-9c17-d8682d9a4242","Type":"ContainerStarted","Data":"abc28f71ad0166db20119efbdb1e70f01f27a0cbef13b04a34bf325396869594"}
Apr 23 17:45:46.338583 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:46.338517 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vjjmx" podStartSLOduration=253.416483614 podStartE2EDuration="4m14.338498333s" podCreationTimestamp="2026-04-23 17:41:32 +0000 UTC" firstStartedPulling="2026-04-23 17:45:44.774573949 +0000 UTC m=+252.704586563" lastFinishedPulling="2026-04-23 17:45:45.696588667 +0000 UTC m=+253.626601282" observedRunningTime="2026-04-23 17:45:46.337400997 +0000 UTC m=+254.267413667" watchObservedRunningTime="2026-04-23 17:45:46.338498333 +0000 UTC m=+254.268510970"
Apr 23 17:45:50.255445 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:50.255360 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:50.255445 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:50.255415 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:53.590894 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:53.590859 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:53.591362 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:53.590916 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:45:53.591362 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:53.590960 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:45:53.591529 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:53.591507 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b"} pod="openshift-image-registry/image-registry-684c867d65-lhpfq" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 23 17:45:53.594587 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:53.594557 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 17:45:53.594701 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:45:53.594615 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 17:46:03.595309 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:03.595273 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry
namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:03.595782 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:03.595334 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:13.594833 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:13.594791 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:13.595198 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:13.594846 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:18.609154 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:18.609111 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" containerID="cri-o://461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b" gracePeriod=30 Apr 23 17:46:19.400085 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:19.400053 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" 
containerID="461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b" exitCode=0 Apr 23 17:46:19.400294 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:19.400133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" event={"ID":"0c7736d1-09e9-4b6a-a2e0-a5675df75c37","Type":"ContainerDied","Data":"461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b"} Apr 23 17:46:19.400294 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:19.400180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" event={"ID":"0c7736d1-09e9-4b6a-a2e0-a5675df75c37","Type":"ContainerStarted","Data":"e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05"} Apr 23 17:46:19.400294 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:19.400200 2576 scope.go:117] "RemoveContainer" containerID="6c311c7ec09a2d97f1174e023ceaae912e3a9a10299d415173474837f42d3624" Apr 23 17:46:19.400437 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:19.400331 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" Apr 23 17:46:32.517258 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:32.517229 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 17:46:32.517709 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:32.517585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 17:46:32.521003 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:32.520973 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 
17:46:32.521376 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:32.521343 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:46:32.526687 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:32.526669 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 17:46:33.590138 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:33.590102 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:33.592436 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:33.590163 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:40.408807 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:40.408773 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:40.409333 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:40.408830 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:42.486601 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.486566 
2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-684c867d65-lhpfq"] Apr 23 17:46:42.492685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.492621 2576 patch_prober.go:28] interesting pod/image-registry-684c867d65-lhpfq container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:46:42.492960 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.492930 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:46:42.494298 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.494264 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-b85zj"] Apr 23 17:46:42.497483 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.497444 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb"] Apr 23 17:46:42.497606 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.497588 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-b85zj" Apr 23 17:46:42.500459 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.500437 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" Apr 23 17:46:42.501034 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.501015 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-fjb29\"" Apr 23 17:46:42.502087 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.502065 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 17:46:42.502184 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.502119 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 17:46:42.503176 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.503159 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-vrsmt\"" Apr 23 17:46:42.503319 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.503299 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 17:46:42.515901 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.515878 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb"] Apr 23 17:46:42.519092 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.519071 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-b85zj"] Apr 23 17:46:42.584046 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.584017 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f99d9d489-gqwc4"] Apr 23 17:46:42.586971 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.586953 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.605952 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.605928 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dpxlz"] Apr 23 17:46:42.609077 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.609056 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f99d9d489-gqwc4"] Apr 23 17:46:42.609194 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.609162 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.614738 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.614704 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 17:46:42.616314 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.616293 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-j9sps\"" Apr 23 17:46:42.616423 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.616393 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 17:46:42.616471 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.616459 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 17:46:42.618811 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.618797 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 17:46:42.633123 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d0c42f94-a436-4dab-83fb-df3e1012e238-trusted-ca\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.633275 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-bound-sa-token\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.633275 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt8dd\" (UniqueName: \"kubernetes.io/projected/e4ed6eb2-5d97-4853-bf31-2c8cae882d07-kube-api-access-kt8dd\") pod \"downloads-6bcc868b7-b85zj\" (UID: \"e4ed6eb2-5d97-4853-bf31-2c8cae882d07\") " pod="openshift-console/downloads-6bcc868b7-b85zj" Apr 23 17:46:42.633275 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d0c42f94-a436-4dab-83fb-df3e1012e238-registry-certificates\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.633421 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-registry-tls\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " 
pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.633421 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d41989e-e48d-451f-8814-4ab5c5096935-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2jncb\" (UID: \"4d41989e-e48d-451f-8814-4ab5c5096935\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" Apr 23 17:46:42.633421 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d0c42f94-a436-4dab-83fb-df3e1012e238-installation-pull-secrets\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.633421 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d0c42f94-a436-4dab-83fb-df3e1012e238-image-registry-private-configuration\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.633421 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.633420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d0c42f94-a436-4dab-83fb-df3e1012e238-ca-trust-extracted\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.633595 ip-10-0-138-68 
kubenswrapper[2576]: I0423 17:46:42.633453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rglx4\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-kube-api-access-rglx4\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.640802 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.640780 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dpxlz"] Apr 23 17:46:42.734507 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6f69\" (UniqueName: \"kubernetes.io/projected/93990938-3621-4020-90b7-3824b5530537-kube-api-access-k6f69\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.734685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d0c42f94-a436-4dab-83fb-df3e1012e238-registry-certificates\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.734685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-registry-tls\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.734685 ip-10-0-138-68 kubenswrapper[2576]: 
I0423 17:46:42.734558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d41989e-e48d-451f-8814-4ab5c5096935-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2jncb\" (UID: \"4d41989e-e48d-451f-8814-4ab5c5096935\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" Apr 23 17:46:42.734685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d0c42f94-a436-4dab-83fb-df3e1012e238-installation-pull-secrets\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.734685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d0c42f94-a436-4dab-83fb-df3e1012e238-image-registry-private-configuration\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.734685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93990938-3621-4020-90b7-3824b5530537-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.734685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/d0c42f94-a436-4dab-83fb-df3e1012e238-ca-trust-extracted\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.734685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rglx4\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-kube-api-access-rglx4\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.735123 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.734698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93990938-3621-4020-90b7-3824b5530537-crio-socket\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.735123 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.735031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0c42f94-a436-4dab-83fb-df3e1012e238-trusted-ca\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.735123 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.735074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93990938-3621-4020-90b7-3824b5530537-data-volume\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.735274 
ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.735120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d0c42f94-a436-4dab-83fb-df3e1012e238-ca-trust-extracted\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.735274 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.735162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/93990938-3621-4020-90b7-3824b5530537-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.735274 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.735200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-bound-sa-token\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.735274 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.735243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt8dd\" (UniqueName: \"kubernetes.io/projected/e4ed6eb2-5d97-4853-bf31-2c8cae882d07-kube-api-access-kt8dd\") pod \"downloads-6bcc868b7-b85zj\" (UID: \"e4ed6eb2-5d97-4853-bf31-2c8cae882d07\") " pod="openshift-console/downloads-6bcc868b7-b85zj" Apr 23 17:46:42.735514 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.735488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/d0c42f94-a436-4dab-83fb-df3e1012e238-registry-certificates\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.736054 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.736033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0c42f94-a436-4dab-83fb-df3e1012e238-trusted-ca\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.737337 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.737284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d0c42f94-a436-4dab-83fb-df3e1012e238-image-registry-private-configuration\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.737409 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.737367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d41989e-e48d-451f-8814-4ab5c5096935-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2jncb\" (UID: \"4d41989e-e48d-451f-8814-4ab5c5096935\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" Apr 23 17:46:42.737731 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.737697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d0c42f94-a436-4dab-83fb-df3e1012e238-installation-pull-secrets\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " 
pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.738159 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.738143 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-registry-tls\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.756524 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.756490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt8dd\" (UniqueName: \"kubernetes.io/projected/e4ed6eb2-5d97-4853-bf31-2c8cae882d07-kube-api-access-kt8dd\") pod \"downloads-6bcc868b7-b85zj\" (UID: \"e4ed6eb2-5d97-4853-bf31-2c8cae882d07\") " pod="openshift-console/downloads-6bcc868b7-b85zj" Apr 23 17:46:42.757174 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.757154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-bound-sa-token\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.757356 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.757336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rglx4\" (UniqueName: \"kubernetes.io/projected/d0c42f94-a436-4dab-83fb-df3e1012e238-kube-api-access-rglx4\") pod \"image-registry-6f99d9d489-gqwc4\" (UID: \"d0c42f94-a436-4dab-83fb-df3e1012e238\") " pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.808422 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.808382 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-b85zj" Apr 23 17:46:42.813211 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.813187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" Apr 23 17:46:42.836096 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.836063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93990938-3621-4020-90b7-3824b5530537-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.836096 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.836110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93990938-3621-4020-90b7-3824b5530537-crio-socket\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.836332 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.836132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93990938-3621-4020-90b7-3824b5530537-data-volume\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.836332 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.836159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/93990938-3621-4020-90b7-3824b5530537-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " 
pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.836332 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.836194 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93990938-3621-4020-90b7-3824b5530537-crio-socket\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.836332 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.836205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6f69\" (UniqueName: \"kubernetes.io/projected/93990938-3621-4020-90b7-3824b5530537-kube-api-access-k6f69\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.837370 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.837332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93990938-3621-4020-90b7-3824b5530537-data-volume\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.837652 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.837624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93990938-3621-4020-90b7-3824b5530537-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.838619 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.838594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/93990938-3621-4020-90b7-3824b5530537-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.845979 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.845892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6f69\" (UniqueName: \"kubernetes.io/projected/93990938-3621-4020-90b7-3824b5530537-kube-api-access-k6f69\") pod \"insights-runtime-extractor-dpxlz\" (UID: \"93990938-3621-4020-90b7-3824b5530537\") " pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.895389 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.895279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:42.918372 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.918010 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dpxlz" Apr 23 17:46:42.951045 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.950978 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-b85zj"] Apr 23 17:46:42.957471 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:46:42.957364 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ed6eb2_5d97_4853_bf31_2c8cae882d07.slice/crio-bf3b336c4a47466afe275684630aca15a5713e9948060ab74fdb7a9d86a5ccdb WatchSource:0}: Error finding container bf3b336c4a47466afe275684630aca15a5713e9948060ab74fdb7a9d86a5ccdb: Status 404 returned error can't find the container with id bf3b336c4a47466afe275684630aca15a5713e9948060ab74fdb7a9d86a5ccdb Apr 23 17:46:42.958913 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.958895 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:46:42.966064 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:42.966039 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb"] Apr 23 17:46:42.972168 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:46:42.970951 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d41989e_e48d_451f_8814_4ab5c5096935.slice/crio-965741f09ea616e5bc60fd2346b66ee914b31fda0654a89f83baa48bd730a2ca WatchSource:0}: Error finding container 965741f09ea616e5bc60fd2346b66ee914b31fda0654a89f83baa48bd730a2ca: Status 404 returned error can't find the container with id 965741f09ea616e5bc60fd2346b66ee914b31fda0654a89f83baa48bd730a2ca Apr 23 17:46:43.033139 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.033110 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f99d9d489-gqwc4"] Apr 23 
17:46:43.036285 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:46:43.036255 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0c42f94_a436_4dab_83fb_df3e1012e238.slice/crio-e8c6786dfbbdcaa64713faaaf08582740e80ac3d89a41d08b6aaf4a1e3b70ca2 WatchSource:0}: Error finding container e8c6786dfbbdcaa64713faaaf08582740e80ac3d89a41d08b6aaf4a1e3b70ca2: Status 404 returned error can't find the container with id e8c6786dfbbdcaa64713faaaf08582740e80ac3d89a41d08b6aaf4a1e3b70ca2 Apr 23 17:46:43.063708 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.063679 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dpxlz"] Apr 23 17:46:43.070128 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:46:43.070088 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93990938_3621_4020_90b7_3824b5530537.slice/crio-af64e6fb489dc9221b9f27728700ee82bd5ec96c4af78efea2ee205d46165123 WatchSource:0}: Error finding container af64e6fb489dc9221b9f27728700ee82bd5ec96c4af78efea2ee205d46165123: Status 404 returned error can't find the container with id af64e6fb489dc9221b9f27728700ee82bd5ec96c4af78efea2ee205d46165123 Apr 23 17:46:43.468112 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.468066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" event={"ID":"4d41989e-e48d-451f-8814-4ab5c5096935","Type":"ContainerStarted","Data":"965741f09ea616e5bc60fd2346b66ee914b31fda0654a89f83baa48bd730a2ca"} Apr 23 17:46:43.470429 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.470394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" 
event={"ID":"d0c42f94-a436-4dab-83fb-df3e1012e238","Type":"ContainerStarted","Data":"2245b6b408b71b085860d81df8e9a304dbd8a5eda9968244d04c4ae1f0744891"} Apr 23 17:46:43.470574 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.470436 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" event={"ID":"d0c42f94-a436-4dab-83fb-df3e1012e238","Type":"ContainerStarted","Data":"e8c6786dfbbdcaa64713faaaf08582740e80ac3d89a41d08b6aaf4a1e3b70ca2"} Apr 23 17:46:43.471208 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.471185 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:46:43.473217 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.473130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dpxlz" event={"ID":"93990938-3621-4020-90b7-3824b5530537","Type":"ContainerStarted","Data":"5e7d1b5a2c20bb6ddb22c847c126c188d0512ad0752ac3b4e2200744b346cef7"} Apr 23 17:46:43.473217 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.473161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dpxlz" event={"ID":"93990938-3621-4020-90b7-3824b5530537","Type":"ContainerStarted","Data":"af64e6fb489dc9221b9f27728700ee82bd5ec96c4af78efea2ee205d46165123"} Apr 23 17:46:43.474344 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.474317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-b85zj" event={"ID":"e4ed6eb2-5d97-4853-bf31-2c8cae882d07","Type":"ContainerStarted","Data":"bf3b336c4a47466afe275684630aca15a5713e9948060ab74fdb7a9d86a5ccdb"} Apr 23 17:46:43.498990 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:43.498920 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" 
podStartSLOduration=1.498900334 podStartE2EDuration="1.498900334s" podCreationTimestamp="2026-04-23 17:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:46:43.498182574 +0000 UTC m=+311.428195213" watchObservedRunningTime="2026-04-23 17:46:43.498900334 +0000 UTC m=+311.428912964" Apr 23 17:46:44.480681 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:44.480640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" event={"ID":"4d41989e-e48d-451f-8814-4ab5c5096935","Type":"ContainerStarted","Data":"4f887949942229f2d36674d5ec4136b5f0c8afc4af5b0d3ccae53f7b57e35845"} Apr 23 17:46:44.480892 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:44.480819 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" Apr 23 17:46:44.483219 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:44.483147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dpxlz" event={"ID":"93990938-3621-4020-90b7-3824b5530537","Type":"ContainerStarted","Data":"c6217445373ada4cc2940d9253af8db2bd9ac5a2c49583137c4b4add7427a044"} Apr 23 17:46:44.487477 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:44.487454 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" Apr 23 17:46:44.501998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:44.501897 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2jncb" podStartSLOduration=1.4492584050000001 podStartE2EDuration="2.501881371s" podCreationTimestamp="2026-04-23 17:46:42 +0000 UTC" firstStartedPulling="2026-04-23 17:46:42.973305378 
+0000 UTC m=+310.903317992" lastFinishedPulling="2026-04-23 17:46:44.02592833 +0000 UTC m=+311.955940958" observedRunningTime="2026-04-23 17:46:44.499802694 +0000 UTC m=+312.429815331" watchObservedRunningTime="2026-04-23 17:46:44.501881371 +0000 UTC m=+312.431894009" Apr 23 17:46:46.497906 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:46.497865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dpxlz" event={"ID":"93990938-3621-4020-90b7-3824b5530537","Type":"ContainerStarted","Data":"f37a837bc9c266db3b7b582bf8ec5c6240e93d8c746eadd2bffeed4d5eb77512"} Apr 23 17:46:49.806002 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.805893 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dpxlz" podStartSLOduration=5.38773596 podStartE2EDuration="7.805874586s" podCreationTimestamp="2026-04-23 17:46:42 +0000 UTC" firstStartedPulling="2026-04-23 17:46:43.116540688 +0000 UTC m=+311.046553305" lastFinishedPulling="2026-04-23 17:46:45.534679313 +0000 UTC m=+313.464691931" observedRunningTime="2026-04-23 17:46:46.537583539 +0000 UTC m=+314.467596175" watchObservedRunningTime="2026-04-23 17:46:49.805874586 +0000 UTC m=+317.735887224" Apr 23 17:46:49.806482 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.806325 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vn6j"] Apr 23 17:46:49.808480 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.808459 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.813677 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.813656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 17:46:49.814711 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.814680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-qgcxr\"" Apr 23 17:46:49.814711 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.814703 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 17:46:49.814917 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.814704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 17:46:49.814917 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.814687 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 17:46:49.814917 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.814710 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 17:46:49.817648 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.817633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 17:46:49.837800 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.837773 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vn6j"] Apr 23 17:46:49.846530 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.846495 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-swxxb"] Apr 23 17:46:49.848815 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.848795 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.852460 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.852276 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 17:46:49.852460 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.852335 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 17:46:49.852733 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.852585 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 17:46:49.852733 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.852665 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hb422\"" Apr 23 17:46:49.894309 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.894267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/de127a70-49dc-4497-bc07-40fa5216e03b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.894558 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.894384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.894558 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.894441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de127a70-49dc-4497-bc07-40fa5216e03b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.894664 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.894619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.894775 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.894757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.894820 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.894792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcdt2\" (UniqueName: \"kubernetes.io/projected/de127a70-49dc-4497-bc07-40fa5216e03b-kube-api-access-fcdt2\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.996187 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.996371 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcdt2\" (UniqueName: \"kubernetes.io/projected/de127a70-49dc-4497-bc07-40fa5216e03b-kube-api-access-fcdt2\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.996371 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/de127a70-49dc-4497-bc07-40fa5216e03b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.996371 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-textfile\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996371 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996287 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-tls\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996371 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-root\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.996621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-wtmp\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-sys\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de127a70-49dc-4497-bc07-40fa5216e03b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.996621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.996621 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996594 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlcmk\" (UniqueName: \"kubernetes.io/projected/f075f196-9caf-4281-83b3-edf93558d8f7-kube-api-access-rlcmk\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996936 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f075f196-9caf-4281-83b3-edf93558d8f7-metrics-client-ca\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb" Apr 23 17:46:49.996936 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.996730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/de127a70-49dc-4497-bc07-40fa5216e03b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.996936 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:46:49.996769 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 23 17:46:49.996936 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:46:49.996834 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-tls podName:de127a70-49dc-4497-bc07-40fa5216e03b nodeName:}" failed. No retries permitted until 2026-04-23 17:46:50.496813222 +0000 UTC m=+318.426825854 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-7vn6j" (UID: "de127a70-49dc-4497-bc07-40fa5216e03b") : secret "kube-state-metrics-tls" not found Apr 23 17:46:49.997129 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.997049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.997235 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.997216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de127a70-49dc-4497-bc07-40fa5216e03b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:49.999576 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:49.999551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" Apr 23 17:46:50.007605 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.007583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcdt2\" (UniqueName: \"kubernetes.io/projected/de127a70-49dc-4497-bc07-40fa5216e03b-kube-api-access-fcdt2\") pod 
\"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j"
Apr 23 17:46:50.097886 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.097850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-wtmp\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098078 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.097893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-accelerators-collector-config\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098078 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.097997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-sys\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098078 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098078 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-wtmp\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098341 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlcmk\" (UniqueName: \"kubernetes.io/projected/f075f196-9caf-4281-83b3-edf93558d8f7-kube-api-access-rlcmk\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098341 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-sys\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098341 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f075f196-9caf-4281-83b3-edf93558d8f7-metrics-client-ca\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098341 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-textfile\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098341 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-tls\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098341 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-root\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098633 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f075f196-9caf-4281-83b3-edf93558d8f7-root\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098691 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-textfile\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.098775 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.098680 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-accelerators-collector-config\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.099331 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.099302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f075f196-9caf-4281-83b3-edf93558d8f7-metrics-client-ca\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.101045 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.101020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.101603 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.101583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f075f196-9caf-4281-83b3-edf93558d8f7-node-exporter-tls\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.111251 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.111229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlcmk\" (UniqueName: \"kubernetes.io/projected/f075f196-9caf-4281-83b3-edf93558d8f7-kube-api-access-rlcmk\") pod \"node-exporter-swxxb\" (UID: \"f075f196-9caf-4281-83b3-edf93558d8f7\") " pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.160386 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.160352 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-swxxb"
Apr 23 17:46:50.170875 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:46:50.170844 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf075f196_9caf_4281_83b3_edf93558d8f7.slice/crio-2b1210c0e4c4775a7405c7d3b8189f258cc342a32fef47dc1b16f4e9c75ce669 WatchSource:0}: Error finding container 2b1210c0e4c4775a7405c7d3b8189f258cc342a32fef47dc1b16f4e9c75ce669: Status 404 returned error can't find the container with id 2b1210c0e4c4775a7405c7d3b8189f258cc342a32fef47dc1b16f4e9c75ce669
Apr 23 17:46:50.501509 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.501424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j"
Apr 23 17:46:50.504226 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.504199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/de127a70-49dc-4497-bc07-40fa5216e03b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vn6j\" (UID: \"de127a70-49dc-4497-bc07-40fa5216e03b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j"
Apr 23 17:46:50.510911 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.510872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-swxxb" event={"ID":"f075f196-9caf-4281-83b3-edf93558d8f7","Type":"ContainerStarted","Data":"2b1210c0e4c4775a7405c7d3b8189f258cc342a32fef47dc1b16f4e9c75ce669"}
Apr 23 17:46:50.720209 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.719770 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j"
Apr 23 17:46:50.990560 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:50.990321 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vn6j"]
Apr 23 17:46:50.993571 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:46:50.993527 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde127a70_49dc_4497_bc07_40fa5216e03b.slice/crio-a65c9acf6aec1547e450212936bb9091bc9fd21fec8e6d6698486100cface59d WatchSource:0}: Error finding container a65c9acf6aec1547e450212936bb9091bc9fd21fec8e6d6698486100cface59d: Status 404 returned error can't find the container with id a65c9acf6aec1547e450212936bb9091bc9fd21fec8e6d6698486100cface59d
Apr 23 17:46:51.515890 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:51.515803 2576 generic.go:358] "Generic (PLEG): container finished" podID="f075f196-9caf-4281-83b3-edf93558d8f7" containerID="7911c3b04d861799f2024c93fadd10374edafb79201b1cef0b0e058ab606177a" exitCode=0
Apr 23 17:46:51.516085 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:51.515891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-swxxb" event={"ID":"f075f196-9caf-4281-83b3-edf93558d8f7","Type":"ContainerDied","Data":"7911c3b04d861799f2024c93fadd10374edafb79201b1cef0b0e058ab606177a"}
Apr 23 17:46:51.517506 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:51.517472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" event={"ID":"de127a70-49dc-4497-bc07-40fa5216e03b","Type":"ContainerStarted","Data":"a65c9acf6aec1547e450212936bb9091bc9fd21fec8e6d6698486100cface59d"}
Apr 23 17:46:52.491259 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.491229 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-684c867d65-lhpfq"
Apr 23 17:46:52.522566 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.522529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-swxxb" event={"ID":"f075f196-9caf-4281-83b3-edf93558d8f7","Type":"ContainerStarted","Data":"e5ef434917efd4a35d6b33f9f42dbef7d968e91a3eab55c3557d7a7ea506e160"}
Apr 23 17:46:52.774148 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.774064 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"]
Apr 23 17:46:52.777114 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.777091 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:52.783474 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.783447 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 23 17:46:52.783578 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.783447 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 23 17:46:52.783854 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.783834 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 23 17:46:52.784096 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.784075 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 23 17:46:52.784249 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.784232 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ar9ojdrff34mg\""
Apr 23 17:46:52.784308 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.784274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 23 17:46:52.785384 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.785365 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-hrcvq\""
Apr 23 17:46:52.798454 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.798428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"]
Apr 23 17:46:52.926175 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.926139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:52.926410 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.926210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-tls\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:52.926410 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.926242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc3bd63-d168-4a51-b334-72c496741f86-metrics-client-ca\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:52.926410 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.926299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-grpc-tls\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:52.926410 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.926338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:52.926410 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.926373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:52.926410 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.926398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmvb6\" (UniqueName: \"kubernetes.io/projected/2cc3bd63-d168-4a51-b334-72c496741f86-kube-api-access-hmvb6\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:52.926760 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:52.926429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.027112 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.027017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.027112 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.027082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.027345 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.027144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-tls\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.027345 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.027174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc3bd63-d168-4a51-b334-72c496741f86-metrics-client-ca\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.027345 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.027214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-grpc-tls\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.027345 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.027251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.027345 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.027287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.027600 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.027336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmvb6\" (UniqueName: \"kubernetes.io/projected/2cc3bd63-d168-4a51-b334-72c496741f86-kube-api-access-hmvb6\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.028607 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.028488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc3bd63-d168-4a51-b334-72c496741f86-metrics-client-ca\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.031752 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.031239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.031752 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.031269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.032566 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.032542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-grpc-tls\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.034398 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.034360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.035455 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.035433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-tls\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.036023 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.035998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2cc3bd63-d168-4a51-b334-72c496741f86-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.039049 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.039030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmvb6\" (UniqueName: \"kubernetes.io/projected/2cc3bd63-d168-4a51-b334-72c496741f86-kube-api-access-hmvb6\") pod \"thanos-querier-5cd7c4cf98-6qxq2\" (UID: \"2cc3bd63-d168-4a51-b334-72c496741f86\") " pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:53.087991 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:53.087956 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"
Apr 23 17:46:54.470035 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.469999 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56c8457455-7bjzd"]
Apr 23 17:46:54.473128 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.473104 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.479979 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.479954 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 23 17:46:54.480122 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.479980 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 17:46:54.480122 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.479956 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 23 17:46:54.480530 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.480501 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 23 17:46:54.480821 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.480676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-z7c8l\""
Apr 23 17:46:54.480821 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.480737 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-fj8lunb0ff5s5\""
Apr 23 17:46:54.513013 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.512982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56c8457455-7bjzd"]
Apr 23 17:46:54.539345 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.539308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d418b34f-88f5-4b79-88db-a4f7534d1469-metrics-server-audit-profiles\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.539517 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.539360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d418b34f-88f5-4b79-88db-a4f7534d1469-audit-log\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.539517 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.539437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d418b34f-88f5-4b79-88db-a4f7534d1469-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.539517 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.539466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-client-ca-bundle\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.539517 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.539493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-secret-metrics-server-client-certs\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.539703 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.539587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9h4\" (UniqueName: \"kubernetes.io/projected/d418b34f-88f5-4b79-88db-a4f7534d1469-kube-api-access-mk9h4\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.539703 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.539623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-secret-metrics-server-tls\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.640437 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.640396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d418b34f-88f5-4b79-88db-a4f7534d1469-metrics-server-audit-profiles\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.640612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.640448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d418b34f-88f5-4b79-88db-a4f7534d1469-audit-log\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.640612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.640492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d418b34f-88f5-4b79-88db-a4f7534d1469-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.640612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.640516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-client-ca-bundle\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.640612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.640537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-secret-metrics-server-client-certs\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.640612 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.640589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk9h4\" (UniqueName: \"kubernetes.io/projected/d418b34f-88f5-4b79-88db-a4f7534d1469-kube-api-access-mk9h4\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.640898 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.640624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-secret-metrics-server-tls\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.641303 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.641254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d418b34f-88f5-4b79-88db-a4f7534d1469-audit-log\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.641685 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.641621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d418b34f-88f5-4b79-88db-a4f7534d1469-metrics-server-audit-profiles\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.641824 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.641772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d418b34f-88f5-4b79-88db-a4f7534d1469-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.643493 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.643446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-secret-metrics-server-tls\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.643640 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.643605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-client-ca-bundle\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.643640 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.643605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d418b34f-88f5-4b79-88db-a4f7534d1469-secret-metrics-server-client-certs\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.662156 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.662128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9h4\" (UniqueName: \"kubernetes.io/projected/d418b34f-88f5-4b79-88db-a4f7534d1469-kube-api-access-mk9h4\") pod \"metrics-server-56c8457455-7bjzd\" (UID: \"d418b34f-88f5-4b79-88db-a4f7534d1469\") " pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:54.784835 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:54.784759 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd"
Apr 23 17:46:59.008794 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.008743 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2"]
Apr 23 17:46:59.013169 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:46:59.013103 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cc3bd63_d168_4a51_b334_72c496741f86.slice/crio-9cbdffcf4932b97de9df36700d4f358dc03b2e483ac2d8a240c2fa57bf1ee7ef WatchSource:0}: Error finding container 9cbdffcf4932b97de9df36700d4f358dc03b2e483ac2d8a240c2fa57bf1ee7ef: Status 404 returned error can't find the container with id 9cbdffcf4932b97de9df36700d4f358dc03b2e483ac2d8a240c2fa57bf1ee7ef
Apr 23 17:46:59.029998 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.029969 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56c8457455-7bjzd"]
Apr 23 17:46:59.033153 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:46:59.033123 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd418b34f_88f5_4b79_88db_a4f7534d1469.slice/crio-76ce6b99fbb70bae07bb09b4af7a793dfc5dd3fff90c2201e80fb31731268cdb WatchSource:0}: Error finding container 76ce6b99fbb70bae07bb09b4af7a793dfc5dd3fff90c2201e80fb31731268cdb: Status 404 returned error can't find the container with id 76ce6b99fbb70bae07bb09b4af7a793dfc5dd3fff90c2201e80fb31731268cdb
Apr 23 17:46:59.545940 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.545886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd" event={"ID":"d418b34f-88f5-4b79-88db-a4f7534d1469","Type":"ContainerStarted","Data":"76ce6b99fbb70bae07bb09b4af7a793dfc5dd3fff90c2201e80fb31731268cdb"}
Apr 23 17:46:59.549093 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.549064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-swxxb" event={"ID":"f075f196-9caf-4281-83b3-edf93558d8f7","Type":"ContainerStarted","Data":"d0107f5d208a9340e400130c0ef95e30ebb2cd85ea6b059a8ae1818a151844ba"}
Apr 23 17:46:59.552490 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.552284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" event={"ID":"de127a70-49dc-4497-bc07-40fa5216e03b","Type":"ContainerStarted","Data":"b3a35e661bf0b67e03cd713370b0487e6acea5129198433c00506369aec08eec"}
Apr 23 17:46:59.552490 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.552316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" event={"ID":"de127a70-49dc-4497-bc07-40fa5216e03b","Type":"ContainerStarted","Data":"0a3231829ac5b7799d9ca3a971d16ffda1bd04bef8e24e2b56bbfc3841e4a9d5"}
Apr 23 17:46:59.552490 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.552330 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" event={"ID":"de127a70-49dc-4497-bc07-40fa5216e03b","Type":"ContainerStarted","Data":"bc5df6a5275058e2ea2ce71a01fd30be55c911194a10b8e208e9bacf05266cdf"}
Apr 23 17:46:59.555499 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.555447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-b85zj" event={"ID":"e4ed6eb2-5d97-4853-bf31-2c8cae882d07","Type":"ContainerStarted","Data":"ac9bac62b56a777fde260d3468232a4eaa786ad394694aa839ee495348687ee0"}
Apr 23 17:46:59.555786 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.555620 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-b85zj"
Apr 23 17:46:59.560093 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.559176 2576 kubelet.go:2569] "SyncLoop (PLEG):
event for pod" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" event={"ID":"2cc3bd63-d168-4a51-b334-72c496741f86","Type":"ContainerStarted","Data":"9cbdffcf4932b97de9df36700d4f358dc03b2e483ac2d8a240c2fa57bf1ee7ef"} Apr 23 17:46:59.568193 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.568153 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-b85zj" Apr 23 17:46:59.589978 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.589846 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-swxxb" podStartSLOduration=9.911559798999999 podStartE2EDuration="10.589820945s" podCreationTimestamp="2026-04-23 17:46:49 +0000 UTC" firstStartedPulling="2026-04-23 17:46:50.172883377 +0000 UTC m=+318.102895995" lastFinishedPulling="2026-04-23 17:46:50.851144527 +0000 UTC m=+318.781157141" observedRunningTime="2026-04-23 17:46:59.589352465 +0000 UTC m=+327.519365102" watchObservedRunningTime="2026-04-23 17:46:59.589820945 +0000 UTC m=+327.519833583" Apr 23 17:46:59.708997 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.708826 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-b85zj" podStartSLOduration=1.8167967 podStartE2EDuration="17.708806013s" podCreationTimestamp="2026-04-23 17:46:42 +0000 UTC" firstStartedPulling="2026-04-23 17:46:42.959027011 +0000 UTC m=+310.889039624" lastFinishedPulling="2026-04-23 17:46:58.851036312 +0000 UTC m=+326.781048937" observedRunningTime="2026-04-23 17:46:59.627529879 +0000 UTC m=+327.557542594" watchObservedRunningTime="2026-04-23 17:46:59.708806013 +0000 UTC m=+327.638818652" Apr 23 17:46:59.709581 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:46:59.709300 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vn6j" podStartSLOduration=2.902899307 
podStartE2EDuration="10.709289822s" podCreationTimestamp="2026-04-23 17:46:49 +0000 UTC" firstStartedPulling="2026-04-23 17:46:50.996131284 +0000 UTC m=+318.926143899" lastFinishedPulling="2026-04-23 17:46:58.802521795 +0000 UTC m=+326.732534414" observedRunningTime="2026-04-23 17:46:59.706093298 +0000 UTC m=+327.636105935" watchObservedRunningTime="2026-04-23 17:46:59.709289822 +0000 UTC m=+327.639302459" Apr 23 17:47:01.569221 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:01.569186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" event={"ID":"2cc3bd63-d168-4a51-b334-72c496741f86","Type":"ContainerStarted","Data":"bbadf41ce42eb8593aaad49ca89fcf41d2d572f7622a6a4d4ca20198f4671451"} Apr 23 17:47:01.571259 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:01.571230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd" event={"ID":"d418b34f-88f5-4b79-88db-a4f7534d1469","Type":"ContainerStarted","Data":"3f47bf9f287678c23b334d443d1f9a70a4dd3a1d853e05c0ab15c0ae6150e686"} Apr 23 17:47:01.615105 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:01.614841 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd" podStartSLOduration=5.214407823 podStartE2EDuration="7.614820978s" podCreationTimestamp="2026-04-23 17:46:54 +0000 UTC" firstStartedPulling="2026-04-23 17:46:59.035261237 +0000 UTC m=+326.965273864" lastFinishedPulling="2026-04-23 17:47:01.435674398 +0000 UTC m=+329.365687019" observedRunningTime="2026-04-23 17:47:01.612116472 +0000 UTC m=+329.542129107" watchObservedRunningTime="2026-04-23 17:47:01.614820978 +0000 UTC m=+329.544833618" Apr 23 17:47:02.578854 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:02.578773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" 
event={"ID":"2cc3bd63-d168-4a51-b334-72c496741f86","Type":"ContainerStarted","Data":"64bbdb0532f7da371dfe4b6d8990a7d61f54ddf7adf63d1aa7cbdf3d4ab822a1"} Apr 23 17:47:02.578854 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:02.578835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" event={"ID":"2cc3bd63-d168-4a51-b334-72c496741f86","Type":"ContainerStarted","Data":"3c51e67d5e5a6a792fb2440ca62048d7077dcd9d7b813a8ea2613f032f7f9a92"} Apr 23 17:47:03.588219 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:03.588178 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" event={"ID":"2cc3bd63-d168-4a51-b334-72c496741f86","Type":"ContainerStarted","Data":"47cf92bd54249ae267f3f44122f2c11d8c054389ffaad89df5ad59d3ea767af2"} Apr 23 17:47:03.588219 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:03.588219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" event={"ID":"2cc3bd63-d168-4a51-b334-72c496741f86","Type":"ContainerStarted","Data":"2dd0d9e7835c30b4bd4dfe013bb55f03d2989f9e2b6d99650f78808299ffd8cd"} Apr 23 17:47:03.588835 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:03.588232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" event={"ID":"2cc3bd63-d168-4a51-b334-72c496741f86","Type":"ContainerStarted","Data":"4053faced3d1d4f128c419c6e96501948df30a732b37e3a8801183b94d22676b"} Apr 23 17:47:03.588835 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:03.588358 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" Apr 23 17:47:03.635575 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:03.635454 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" 
podStartSLOduration=7.594923651 podStartE2EDuration="11.635434421s" podCreationTimestamp="2026-04-23 17:46:52 +0000 UTC" firstStartedPulling="2026-04-23 17:46:59.018002754 +0000 UTC m=+326.948015382" lastFinishedPulling="2026-04-23 17:47:03.058513538 +0000 UTC m=+330.988526152" observedRunningTime="2026-04-23 17:47:03.633058043 +0000 UTC m=+331.563070678" watchObservedRunningTime="2026-04-23 17:47:03.635434421 +0000 UTC m=+331.565447058" Apr 23 17:47:05.496117 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:05.496084 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f99d9d489-gqwc4" Apr 23 17:47:07.508438 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.508389 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" containerID="cri-o://e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05" gracePeriod=30 Apr 23 17:47:07.796034 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.796004 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" Apr 23 17:47:07.970618 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.970587 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") pod \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " Apr 23 17:47:07.970798 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.970662 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf2fj\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-kube-api-access-rf2fj\") pod \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " Apr 23 17:47:07.970798 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.970705 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-trusted-ca\") pod \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " Apr 23 17:47:07.970798 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.970770 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-certificates\") pod \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " Apr 23 17:47:07.970977 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.970802 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-image-registry-private-configuration\") pod \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\" (UID: 
\"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " Apr 23 17:47:07.970977 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.970864 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-bound-sa-token\") pod \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " Apr 23 17:47:07.970977 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.970896 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-ca-trust-extracted\") pod \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " Apr 23 17:47:07.970977 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.970920 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-installation-pull-secrets\") pod \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\" (UID: \"0c7736d1-09e9-4b6a-a2e0-a5675df75c37\") " Apr 23 17:47:07.971285 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.971254 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0c7736d1-09e9-4b6a-a2e0-a5675df75c37" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:07.971357 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.971297 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0c7736d1-09e9-4b6a-a2e0-a5675df75c37" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:47:07.973320 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.973287 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0c7736d1-09e9-4b6a-a2e0-a5675df75c37" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:07.973606 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.973571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0c7736d1-09e9-4b6a-a2e0-a5675df75c37" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:07.973705 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.973608 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0c7736d1-09e9-4b6a-a2e0-a5675df75c37" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:07.973923 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.973892 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-kube-api-access-rf2fj" (OuterVolumeSpecName: "kube-api-access-rf2fj") pod "0c7736d1-09e9-4b6a-a2e0-a5675df75c37" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37"). InnerVolumeSpecName "kube-api-access-rf2fj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:47:07.974007 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.973984 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0c7736d1-09e9-4b6a-a2e0-a5675df75c37" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:47:07.982217 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:07.982190 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0c7736d1-09e9-4b6a-a2e0-a5675df75c37" (UID: "0c7736d1-09e9-4b6a-a2e0-a5675df75c37"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:47:08.072536 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.072500 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-certificates\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 17:47:08.072536 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.072533 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-image-registry-private-configuration\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 17:47:08.072794 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.072546 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-bound-sa-token\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 17:47:08.072794 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.072560 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-ca-trust-extracted\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 17:47:08.072794 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.072572 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-installation-pull-secrets\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 17:47:08.072794 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.072585 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-registry-tls\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 
23 17:47:08.072794 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.072597 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rf2fj\" (UniqueName: \"kubernetes.io/projected/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-kube-api-access-rf2fj\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 17:47:08.072794 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.072611 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c7736d1-09e9-4b6a-a2e0-a5675df75c37-trusted-ca\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 17:47:08.607271 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.607209 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerID="e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05" exitCode=0 Apr 23 17:47:08.607754 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.607304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" event={"ID":"0c7736d1-09e9-4b6a-a2e0-a5675df75c37","Type":"ContainerDied","Data":"e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05"} Apr 23 17:47:08.607754 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.607310 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" Apr 23 17:47:08.607754 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.607335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-684c867d65-lhpfq" event={"ID":"0c7736d1-09e9-4b6a-a2e0-a5675df75c37","Type":"ContainerDied","Data":"078718bed934975d0bf936722f642afdaf84b91a5101fa311c084348deb7a8c0"} Apr 23 17:47:08.607754 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.607350 2576 scope.go:117] "RemoveContainer" containerID="e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05" Apr 23 17:47:08.617879 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.617855 2576 scope.go:117] "RemoveContainer" containerID="461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b" Apr 23 17:47:08.626412 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.626378 2576 scope.go:117] "RemoveContainer" containerID="e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05" Apr 23 17:47:08.626765 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:47:08.626700 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05\": container with ID starting with e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05 not found: ID does not exist" containerID="e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05" Apr 23 17:47:08.626864 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.626759 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05"} err="failed to get container status \"e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05\": rpc error: code = NotFound desc = could not find container \"e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05\": 
container with ID starting with e73d5a24c0e5bf10dc5524803bd741d74874939c4907d7ae69b51de8173f1c05 not found: ID does not exist" Apr 23 17:47:08.626864 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.626788 2576 scope.go:117] "RemoveContainer" containerID="461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b" Apr 23 17:47:08.627099 ip-10-0-138-68 kubenswrapper[2576]: E0423 17:47:08.627077 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b\": container with ID starting with 461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b not found: ID does not exist" containerID="461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b" Apr 23 17:47:08.627185 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.627110 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b"} err="failed to get container status \"461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b\": rpc error: code = NotFound desc = could not find container \"461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b\": container with ID starting with 461fcaeea42b73a645fed303e444aeb380e605962a77efaaff6118e608b88a4b not found: ID does not exist" Apr 23 17:47:08.652792 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.652754 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-684c867d65-lhpfq"] Apr 23 17:47:08.666451 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:08.666327 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-684c867d65-lhpfq"] Apr 23 17:47:09.598384 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:09.598358 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/thanos-querier-5cd7c4cf98-6qxq2" Apr 23 17:47:10.646870 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:10.646796 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" path="/var/lib/kubelet/pods/0c7736d1-09e9-4b6a-a2e0-a5675df75c37/volumes" Apr 23 17:47:14.785174 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:14.785133 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd" Apr 23 17:47:14.785174 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:14.785183 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd" Apr 23 17:47:34.790888 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:34.790859 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd" Apr 23 17:47:34.794752 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:47:34.794709 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56c8457455-7bjzd" Apr 23 17:49:46.957099 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.957063 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv"] Apr 23 17:49:46.957587 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.957334 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 17:49:46.957587 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.957344 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 17:49:46.957587 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.957353 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 17:49:46.957587 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.957359 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 17:49:46.957587 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.957404 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 17:49:46.957587 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.957412 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 17:49:46.960114 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.960097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" Apr 23 17:49:46.963321 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.963300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-74k4b\"" Apr 23 17:49:46.963432 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.963300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 17:49:46.964228 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.964209 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 17:49:46.964336 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.964262 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 17:49:46.964336 ip-10-0-138-68 kubenswrapper[2576]: I0423 
17:49:46.964267 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 17:49:46.970676 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:46.970659 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv"] Apr 23 17:49:47.067881 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:47.067848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6bs\" (UniqueName: \"kubernetes.io/projected/67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b-kube-api-access-8x6bs\") pod \"managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv\" (UID: \"67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" Apr 23 17:49:47.068043 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:47.067896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv\" (UID: \"67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" Apr 23 17:49:47.169000 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:47.168969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x6bs\" (UniqueName: \"kubernetes.io/projected/67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b-kube-api-access-8x6bs\") pod \"managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv\" (UID: \"67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" Apr 23 17:49:47.169175 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:47.169016 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv\" (UID: \"67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" Apr 23 17:49:47.171418 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:47.171397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv\" (UID: \"67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" Apr 23 17:49:47.202771 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:47.202738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x6bs\" (UniqueName: \"kubernetes.io/projected/67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b-kube-api-access-8x6bs\") pod \"managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv\" (UID: \"67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" Apr 23 17:49:47.276678 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:47.276598 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" Apr 23 17:49:47.394847 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:47.394815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv"] Apr 23 17:49:47.398077 ip-10-0-138-68 kubenswrapper[2576]: W0423 17:49:47.398042 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67a7ccbd_baa1_4dbd_8dcf_d27a8e734e1b.slice/crio-213c8cbd5025c1878a98e1b9d60dceeed0d1bcd105dfc7cfaaa1bd23eddc2313 WatchSource:0}: Error finding container 213c8cbd5025c1878a98e1b9d60dceeed0d1bcd105dfc7cfaaa1bd23eddc2313: Status 404 returned error can't find the container with id 213c8cbd5025c1878a98e1b9d60dceeed0d1bcd105dfc7cfaaa1bd23eddc2313 Apr 23 17:49:48.047441 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:48.047343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" event={"ID":"67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b","Type":"ContainerStarted","Data":"213c8cbd5025c1878a98e1b9d60dceeed0d1bcd105dfc7cfaaa1bd23eddc2313"} Apr 23 17:49:50.054669 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:49:50.054634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" event={"ID":"67a7ccbd-baa1-4dbd-8dcf-d27a8e734e1b","Type":"ContainerStarted","Data":"a0eea3126ca8384f347092f092463d966a6736b9ebb17cfcf4a9b69372f494b4"} Apr 23 17:51:32.537993 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:51:32.537964 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 17:51:32.539054 ip-10-0-138-68 kubenswrapper[2576]: 
I0423 17:51:32.539033 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 17:51:32.541806 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:51:32.541786 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:51:32.542431 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:51:32.542412 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:56:32.556781 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:56:32.556693 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 17:56:32.559417 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:56:32.559396 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 17:56:32.561013 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:56:32.560994 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 17:56:32.563233 ip-10-0-138-68 kubenswrapper[2576]: I0423 17:56:32.563215 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:01:32.578828 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:01:32.578795 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:01:32.580844 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:01:32.580820 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:01:32.582294 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:01:32.582273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:01:32.584151 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:01:32.584132 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:06:32.597456 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:06:32.597429 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:06:32.599905 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:06:32.599626 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:06:32.600945 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:06:32.600925 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:06:32.603195 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:06:32.603178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:11:32.615898 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:11:32.615778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:11:32.622863 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:11:32.619129 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:11:32.622863 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:11:32.619516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:11:32.623026 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:11:32.622902 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:15:05.331158 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.331092 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59fcdfc85c-5dbwv" podStartSLOduration=1516.912079841 podStartE2EDuration="25m19.3310756s" podCreationTimestamp="2026-04-23 17:49:46 +0000 UTC" firstStartedPulling="2026-04-23 17:49:47.400111634 +0000 UTC m=+495.330124249" lastFinishedPulling="2026-04-23 17:49:49.819107393 +0000 UTC m=+497.749120008" observedRunningTime="2026-04-23 17:49:50.072632515 +0000 UTC m=+498.002645148" watchObservedRunningTime="2026-04-23 18:15:05.3310756 +0000 UTC m=+2013.261088244" Apr 23 18:15:05.331658 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.331599 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg"] Apr 23 18:15:05.331953 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.331939 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 18:15:05.331997 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.331954 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 18:15:05.332030 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.332005 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c7736d1-09e9-4b6a-a2e0-a5675df75c37" containerName="registry" Apr 23 18:15:05.335089 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.335073 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.337979 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.337958 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 18:15:05.339264 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.339246 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b7cnz\"" Apr 23 18:15:05.339334 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.339263 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 18:15:05.352183 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.352164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg"] Apr 23 18:15:05.410251 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.410219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvw8m\" (UniqueName: \"kubernetes.io/projected/2a838a59-967e-43bd-9310-d2516e4b9e39-kube-api-access-kvw8m\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.410426 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.410274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.410426 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.410340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.511674 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.511637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvw8m\" (UniqueName: \"kubernetes.io/projected/2a838a59-967e-43bd-9310-d2516e4b9e39-kube-api-access-kvw8m\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.511881 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.511706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-bundle\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.511881 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.511773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.512257 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.512233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.512293 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.512243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.521146 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.521121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvw8m\" (UniqueName: \"kubernetes.io/projected/2a838a59-967e-43bd-9310-d2516e4b9e39-kube-api-access-kvw8m\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg\" (UID: 
\"2a838a59-967e-43bd-9310-d2516e4b9e39\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.644040 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.643929 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:05.763275 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.763241 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg"] Apr 23 18:15:05.767584 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:15:05.767555 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a838a59_967e_43bd_9310_d2516e4b9e39.slice/crio-e11386ab38d1235c1e619a7eaf3084b2201f8908824238d6c00dbffae8e54579 WatchSource:0}: Error finding container e11386ab38d1235c1e619a7eaf3084b2201f8908824238d6c00dbffae8e54579: Status 404 returned error can't find the container with id e11386ab38d1235c1e619a7eaf3084b2201f8908824238d6c00dbffae8e54579 Apr 23 18:15:05.769858 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:05.769843 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:15:06.261114 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:06.261074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" event={"ID":"2a838a59-967e-43bd-9310-d2516e4b9e39","Type":"ContainerStarted","Data":"e11386ab38d1235c1e619a7eaf3084b2201f8908824238d6c00dbffae8e54579"} Apr 23 18:15:11.276876 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:11.276837 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerID="b4bf71f627aab1a5b2c8e982fc6548f9e9a6dd31d29ac41fd3686e057ba892ac" 
exitCode=0 Apr 23 18:15:11.277448 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:11.276921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" event={"ID":"2a838a59-967e-43bd-9310-d2516e4b9e39","Type":"ContainerDied","Data":"b4bf71f627aab1a5b2c8e982fc6548f9e9a6dd31d29ac41fd3686e057ba892ac"} Apr 23 18:15:14.287987 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:14.287948 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerID="db39e98a3546af7e2454befd18989407ec0d200483637c82f7f12450f349f6a4" exitCode=0 Apr 23 18:15:14.288376 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:14.288037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" event={"ID":"2a838a59-967e-43bd-9310-d2516e4b9e39","Type":"ContainerDied","Data":"db39e98a3546af7e2454befd18989407ec0d200483637c82f7f12450f349f6a4"} Apr 23 18:15:21.309496 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:21.309456 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerID="63630aadc36e6a89ddd6c852a48796416492ea732bc009f5136a73ad936c444a" exitCode=0 Apr 23 18:15:21.309889 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:21.309580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" event={"ID":"2a838a59-967e-43bd-9310-d2516e4b9e39","Type":"ContainerDied","Data":"63630aadc36e6a89ddd6c852a48796416492ea732bc009f5136a73ad936c444a"} Apr 23 18:15:22.427506 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.427477 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:22.564050 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.563947 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-util\") pod \"2a838a59-967e-43bd-9310-d2516e4b9e39\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " Apr 23 18:15:22.564050 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.564022 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvw8m\" (UniqueName: \"kubernetes.io/projected/2a838a59-967e-43bd-9310-d2516e4b9e39-kube-api-access-kvw8m\") pod \"2a838a59-967e-43bd-9310-d2516e4b9e39\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " Apr 23 18:15:22.564285 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.564083 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-bundle\") pod \"2a838a59-967e-43bd-9310-d2516e4b9e39\" (UID: \"2a838a59-967e-43bd-9310-d2516e4b9e39\") " Apr 23 18:15:22.564624 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.564597 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-bundle" (OuterVolumeSpecName: "bundle") pod "2a838a59-967e-43bd-9310-d2516e4b9e39" (UID: "2a838a59-967e-43bd-9310-d2516e4b9e39"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:15:22.566222 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.566196 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a838a59-967e-43bd-9310-d2516e4b9e39-kube-api-access-kvw8m" (OuterVolumeSpecName: "kube-api-access-kvw8m") pod "2a838a59-967e-43bd-9310-d2516e4b9e39" (UID: "2a838a59-967e-43bd-9310-d2516e4b9e39"). InnerVolumeSpecName "kube-api-access-kvw8m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:15:22.569676 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.569640 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-util" (OuterVolumeSpecName: "util") pod "2a838a59-967e-43bd-9310-d2516e4b9e39" (UID: "2a838a59-967e-43bd-9310-d2516e4b9e39"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:15:22.664840 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.664814 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvw8m\" (UniqueName: \"kubernetes.io/projected/2a838a59-967e-43bd-9310-d2516e4b9e39-kube-api-access-kvw8m\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:15:22.664840 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.664840 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:15:22.664998 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:22.664854 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a838a59-967e-43bd-9310-d2516e4b9e39-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:15:23.316985 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:23.316949 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" event={"ID":"2a838a59-967e-43bd-9310-d2516e4b9e39","Type":"ContainerDied","Data":"e11386ab38d1235c1e619a7eaf3084b2201f8908824238d6c00dbffae8e54579"} Apr 23 18:15:23.316985 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:23.316981 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e11386ab38d1235c1e619a7eaf3084b2201f8908824238d6c00dbffae8e54579" Apr 23 18:15:23.317231 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:23.317034 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfflkg" Apr 23 18:15:27.511275 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.511241 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9"] Apr 23 18:15:27.511747 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.511539 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerName="util" Apr 23 18:15:27.511747 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.511552 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerName="util" Apr 23 18:15:27.511747 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.511563 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerName="extract" Apr 23 18:15:27.511747 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.511569 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerName="extract" Apr 23 18:15:27.511747 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.511588 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerName="pull" Apr 23 18:15:27.511747 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.511595 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerName="pull" Apr 23 18:15:27.511747 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.511646 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a838a59-967e-43bd-9310-d2516e4b9e39" containerName="extract" Apr 23 18:15:27.560773 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.560712 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9"] Apr 23 18:15:27.560931 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.560874 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:27.564312 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.564266 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 23 18:15:27.564312 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.564316 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 23 18:15:27.564530 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.564270 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-wqcpw\"" Apr 23 18:15:27.564530 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.564279 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 23 18:15:27.704759 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.704696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/257c6ed0-8780-4baa-a4c0-c1a6fcd4606e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tklc9\" (UID: \"257c6ed0-8780-4baa-a4c0-c1a6fcd4606e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:27.704759 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.704764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbczb\" (UniqueName: \"kubernetes.io/projected/257c6ed0-8780-4baa-a4c0-c1a6fcd4606e-kube-api-access-qbczb\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tklc9\" (UID: \"257c6ed0-8780-4baa-a4c0-c1a6fcd4606e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:27.806246 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.806155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/257c6ed0-8780-4baa-a4c0-c1a6fcd4606e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tklc9\" (UID: \"257c6ed0-8780-4baa-a4c0-c1a6fcd4606e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:27.806246 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.806220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbczb\" (UniqueName: \"kubernetes.io/projected/257c6ed0-8780-4baa-a4c0-c1a6fcd4606e-kube-api-access-qbczb\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tklc9\" (UID: \"257c6ed0-8780-4baa-a4c0-c1a6fcd4606e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:27.808519 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.808488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/257c6ed0-8780-4baa-a4c0-c1a6fcd4606e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tklc9\" (UID: 
\"257c6ed0-8780-4baa-a4c0-c1a6fcd4606e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:27.816229 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.816203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbczb\" (UniqueName: \"kubernetes.io/projected/257c6ed0-8780-4baa-a4c0-c1a6fcd4606e-kube-api-access-qbczb\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tklc9\" (UID: \"257c6ed0-8780-4baa-a4c0-c1a6fcd4606e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:27.871235 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:27.871196 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:28.017055 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:28.017020 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9"] Apr 23 18:15:28.020702 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:15:28.020671 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod257c6ed0_8780_4baa_a4c0_c1a6fcd4606e.slice/crio-6a354c888730f9410bd6969a2b0d0b58dc6dde59940eb74597a4dd9bf3939303 WatchSource:0}: Error finding container 6a354c888730f9410bd6969a2b0d0b58dc6dde59940eb74597a4dd9bf3939303: Status 404 returned error can't find the container with id 6a354c888730f9410bd6969a2b0d0b58dc6dde59940eb74597a4dd9bf3939303 Apr 23 18:15:28.336429 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:28.336392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" event={"ID":"257c6ed0-8780-4baa-a4c0-c1a6fcd4606e","Type":"ContainerStarted","Data":"6a354c888730f9410bd6969a2b0d0b58dc6dde59940eb74597a4dd9bf3939303"} Apr 23 18:15:32.352945 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:15:32.352910 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" event={"ID":"257c6ed0-8780-4baa-a4c0-c1a6fcd4606e","Type":"ContainerStarted","Data":"4052ed5e3f19be2c5b26fa845600d7c48be94c715e6872b1776da809890c66da"} Apr 23 18:15:32.353303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.353026 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:32.379413 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.379367 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" podStartSLOduration=1.656582083 podStartE2EDuration="5.379351533s" podCreationTimestamp="2026-04-23 18:15:27 +0000 UTC" firstStartedPulling="2026-04-23 18:15:28.022415302 +0000 UTC m=+2035.952427919" lastFinishedPulling="2026-04-23 18:15:31.745184752 +0000 UTC m=+2039.675197369" observedRunningTime="2026-04-23 18:15:32.377419054 +0000 UTC m=+2040.307431691" watchObservedRunningTime="2026-04-23 18:15:32.379351533 +0000 UTC m=+2040.309364168" Apr 23 18:15:32.623694 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.623616 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr"] Apr 23 18:15:32.640174 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.640145 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.643389 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.643367 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 23 18:15:32.643757 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.643642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-6clmg\"" Apr 23 18:15:32.643757 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.643654 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 23 18:15:32.651199 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.651163 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr"] Apr 23 18:15:32.755479 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.755444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrfnr\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-kube-api-access-mrfnr\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.755708 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.755684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.755824 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.755806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/37aee467-a634-4144-b2de-bb58e61dbbf9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.856659 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.856626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/37aee467-a634-4144-b2de-bb58e61dbbf9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.856858 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.856766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrfnr\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-kube-api-access-mrfnr\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.856858 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.856820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.856993 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:32.856972 2576 secret.go:281] references non-existent secret key: tls.crt Apr 23 18:15:32.857038 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:32.856999 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 18:15:32.857038 ip-10-0-138-68 kubenswrapper[2576]: E0423 
18:15:32.857024 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr: references non-existent secret key: tls.crt Apr 23 18:15:32.857108 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.857052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/37aee467-a634-4144-b2de-bb58e61dbbf9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.857108 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:32.857084 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates podName:37aee467-a634-4144-b2de-bb58e61dbbf9 nodeName:}" failed. No retries permitted until 2026-04-23 18:15:33.357065685 +0000 UTC m=+2041.287078299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates") pod "keda-metrics-apiserver-7c9f485588-ff4wr" (UID: "37aee467-a634-4144-b2de-bb58e61dbbf9") : references non-existent secret key: tls.crt Apr 23 18:15:32.868367 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.868338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrfnr\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-kube-api-access-mrfnr\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:32.972054 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.971970 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-x8vx4"] Apr 23 18:15:32.990646 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.990620 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-x8vx4"] Apr 23 18:15:32.990807 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.990747 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:32.993514 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:32.993489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 23 18:15:33.158874 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.158844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/52e60026-15ca-42ce-a066-175d7e754166-certificates\") pod \"keda-admission-cf49989db-x8vx4\" (UID: \"52e60026-15ca-42ce-a066-175d7e754166\") " pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:33.159039 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.158894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvj6l\" (UniqueName: \"kubernetes.io/projected/52e60026-15ca-42ce-a066-175d7e754166-kube-api-access-vvj6l\") pod \"keda-admission-cf49989db-x8vx4\" (UID: \"52e60026-15ca-42ce-a066-175d7e754166\") " pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:33.259897 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.259817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/52e60026-15ca-42ce-a066-175d7e754166-certificates\") pod \"keda-admission-cf49989db-x8vx4\" (UID: \"52e60026-15ca-42ce-a066-175d7e754166\") " pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:33.259897 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.259868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvj6l\" (UniqueName: \"kubernetes.io/projected/52e60026-15ca-42ce-a066-175d7e754166-kube-api-access-vvj6l\") pod \"keda-admission-cf49989db-x8vx4\" (UID: \"52e60026-15ca-42ce-a066-175d7e754166\") " 
pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:33.262309 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.262285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/52e60026-15ca-42ce-a066-175d7e754166-certificates\") pod \"keda-admission-cf49989db-x8vx4\" (UID: \"52e60026-15ca-42ce-a066-175d7e754166\") " pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:33.269112 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.269090 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvj6l\" (UniqueName: \"kubernetes.io/projected/52e60026-15ca-42ce-a066-175d7e754166-kube-api-access-vvj6l\") pod \"keda-admission-cf49989db-x8vx4\" (UID: \"52e60026-15ca-42ce-a066-175d7e754166\") " pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:33.301520 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.301493 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:33.360986 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.360954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:33.361369 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:33.361116 2576 secret.go:281] references non-existent secret key: tls.crt Apr 23 18:15:33.361369 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:33.361140 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 18:15:33.361369 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:33.361164 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr: references non-existent secret key: tls.crt Apr 23 18:15:33.361369 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:33.361262 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates podName:37aee467-a634-4144-b2de-bb58e61dbbf9 nodeName:}" failed. No retries permitted until 2026-04-23 18:15:34.361242396 +0000 UTC m=+2042.291255028 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates") pod "keda-metrics-apiserver-7c9f485588-ff4wr" (UID: "37aee467-a634-4144-b2de-bb58e61dbbf9") : references non-existent secret key: tls.crt Apr 23 18:15:33.425815 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:33.425774 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-x8vx4"] Apr 23 18:15:33.428508 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:15:33.428478 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e60026_15ca_42ce_a066_175d7e754166.slice/crio-235254e4347f1715742e1553efabbd41dc27ee96723a5fb7a1600060ab35d9f9 WatchSource:0}: Error finding container 235254e4347f1715742e1553efabbd41dc27ee96723a5fb7a1600060ab35d9f9: Status 404 returned error can't find the container with id 235254e4347f1715742e1553efabbd41dc27ee96723a5fb7a1600060ab35d9f9 Apr 23 18:15:34.361864 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:34.361827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-x8vx4" event={"ID":"52e60026-15ca-42ce-a066-175d7e754166","Type":"ContainerStarted","Data":"235254e4347f1715742e1553efabbd41dc27ee96723a5fb7a1600060ab35d9f9"} Apr 23 18:15:34.371165 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:34.371134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:34.371307 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:34.371252 2576 secret.go:281] references non-existent secret key: tls.crt Apr 23 18:15:34.371307 ip-10-0-138-68 
kubenswrapper[2576]: E0423 18:15:34.371263 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 23 18:15:34.371307 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:34.371280 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr: references non-existent secret key: tls.crt Apr 23 18:15:34.371419 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:15:34.371325 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates podName:37aee467-a634-4144-b2de-bb58e61dbbf9 nodeName:}" failed. No retries permitted until 2026-04-23 18:15:36.371311951 +0000 UTC m=+2044.301324570 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates") pod "keda-metrics-apiserver-7c9f485588-ff4wr" (UID: "37aee467-a634-4144-b2de-bb58e61dbbf9") : references non-existent secret key: tls.crt Apr 23 18:15:35.369787 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:35.369744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-x8vx4" event={"ID":"52e60026-15ca-42ce-a066-175d7e754166","Type":"ContainerStarted","Data":"66a2f8b05d0ffd337b0b3cc0fdee6bf9889d341874bef97b77f0ebfc95e4ae7a"} Apr 23 18:15:35.370233 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:35.369858 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:15:35.388734 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:35.388670 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-x8vx4" podStartSLOduration=1.78037861 podStartE2EDuration="3.388655385s" podCreationTimestamp="2026-04-23 18:15:32 +0000 UTC" 
firstStartedPulling="2026-04-23 18:15:33.429897521 +0000 UTC m=+2041.359910135" lastFinishedPulling="2026-04-23 18:15:35.038174295 +0000 UTC m=+2042.968186910" observedRunningTime="2026-04-23 18:15:35.387929916 +0000 UTC m=+2043.317942565" watchObservedRunningTime="2026-04-23 18:15:35.388655385 +0000 UTC m=+2043.318668021" Apr 23 18:15:36.388001 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:36.387967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:36.390629 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:36.390603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/37aee467-a634-4144-b2de-bb58e61dbbf9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ff4wr\" (UID: \"37aee467-a634-4144-b2de-bb58e61dbbf9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:36.551558 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:36.551515 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:36.671040 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:36.671013 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr"] Apr 23 18:15:36.673489 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:15:36.673462 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37aee467_a634_4144_b2de_bb58e61dbbf9.slice/crio-0b75ceb99cbb9cb2dbabf4fbf2b9e20fb6633a7e4560cf73a70f6a2f1837a54b WatchSource:0}: Error finding container 0b75ceb99cbb9cb2dbabf4fbf2b9e20fb6633a7e4560cf73a70f6a2f1837a54b: Status 404 returned error can't find the container with id 0b75ceb99cbb9cb2dbabf4fbf2b9e20fb6633a7e4560cf73a70f6a2f1837a54b Apr 23 18:15:37.378314 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:37.378268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" event={"ID":"37aee467-a634-4144-b2de-bb58e61dbbf9","Type":"ContainerStarted","Data":"0b75ceb99cbb9cb2dbabf4fbf2b9e20fb6633a7e4560cf73a70f6a2f1837a54b"} Apr 23 18:15:39.385941 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:39.385905 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" event={"ID":"37aee467-a634-4144-b2de-bb58e61dbbf9","Type":"ContainerStarted","Data":"cc311ee170203583103925241371bb439b41ebaf87378edd6b2523a38e5af90d"} Apr 23 18:15:39.386353 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:39.385969 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:39.429093 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:39.429040 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" 
podStartSLOduration=5.290010127 podStartE2EDuration="7.429026055s" podCreationTimestamp="2026-04-23 18:15:32 +0000 UTC" firstStartedPulling="2026-04-23 18:15:36.674785784 +0000 UTC m=+2044.604798401" lastFinishedPulling="2026-04-23 18:15:38.813801712 +0000 UTC m=+2046.743814329" observedRunningTime="2026-04-23 18:15:39.427977612 +0000 UTC m=+2047.357990260" watchObservedRunningTime="2026-04-23 18:15:39.429026055 +0000 UTC m=+2047.359038700" Apr 23 18:15:50.394463 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:50.394379 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ff4wr" Apr 23 18:15:53.360009 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:53.359979 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tklc9" Apr 23 18:15:56.375805 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:15:56.375775 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-x8vx4" Apr 23 18:16:25.932496 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:25.932460 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7"] Apr 23 18:16:25.937813 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:25.937791 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:25.940805 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:25.940782 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 18:16:25.940911 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:25.940812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b7cnz\"" Apr 23 18:16:25.940911 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:25.940782 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 18:16:25.945273 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:25.945247 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7"] Apr 23 18:16:26.077652 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.077620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.077846 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.077663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.077846 
ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.077692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chs7k\" (UniqueName: \"kubernetes.io/projected/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-kube-api-access-chs7k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.178950 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.178918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.179115 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.178962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.179115 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.179001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chs7k\" (UniqueName: \"kubernetes.io/projected/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-kube-api-access-chs7k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.179323 ip-10-0-138-68 
kubenswrapper[2576]: I0423 18:16:26.179299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.179411 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.179390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.189108 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.189055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chs7k\" (UniqueName: \"kubernetes.io/projected/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-kube-api-access-chs7k\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.247684 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.247628 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:26.372610 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.372584 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7"] Apr 23 18:16:26.374942 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:16:26.374915 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955a9868_3cb7_4d5a_9d7b_95d0d09d0a93.slice/crio-b717769c3eb2a326d2b97fb1f909e836b190084d2dc3f5180e28e5a241a6af66 WatchSource:0}: Error finding container b717769c3eb2a326d2b97fb1f909e836b190084d2dc3f5180e28e5a241a6af66: Status 404 returned error can't find the container with id b717769c3eb2a326d2b97fb1f909e836b190084d2dc3f5180e28e5a241a6af66 Apr 23 18:16:26.534994 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.534955 2576 generic.go:358] "Generic (PLEG): container finished" podID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerID="32059dd63622e352ec8861c76b22efc8665134cd93ed4ddb04ece88279cf0c2d" exitCode=0 Apr 23 18:16:26.535157 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.535010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" event={"ID":"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93","Type":"ContainerDied","Data":"32059dd63622e352ec8861c76b22efc8665134cd93ed4ddb04ece88279cf0c2d"} Apr 23 18:16:26.535157 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:26.535038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" event={"ID":"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93","Type":"ContainerStarted","Data":"b717769c3eb2a326d2b97fb1f909e836b190084d2dc3f5180e28e5a241a6af66"} Apr 23 18:16:29.546185 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:16:29.546142 2576 generic.go:358] "Generic (PLEG): container finished" podID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerID="073a07628051b126e8ccb01b0393312aed25d24d529998e0cf758e3bc99627bd" exitCode=0 Apr 23 18:16:29.546567 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:29.546230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" event={"ID":"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93","Type":"ContainerDied","Data":"073a07628051b126e8ccb01b0393312aed25d24d529998e0cf758e3bc99627bd"} Apr 23 18:16:30.551653 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:30.551615 2576 generic.go:358] "Generic (PLEG): container finished" podID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerID="37cf674bda28851a9e094f3ac259e8660c696a5d9153079f1fb20f8208af92fb" exitCode=0 Apr 23 18:16:30.552074 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:30.551706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" event={"ID":"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93","Type":"ContainerDied","Data":"37cf674bda28851a9e094f3ac259e8660c696a5d9153079f1fb20f8208af92fb"} Apr 23 18:16:31.676986 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.676964 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:31.827553 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.827523 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chs7k\" (UniqueName: \"kubernetes.io/projected/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-kube-api-access-chs7k\") pod \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " Apr 23 18:16:31.827733 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.827578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-bundle\") pod \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " Apr 23 18:16:31.827733 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.827637 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-util\") pod \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\" (UID: \"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93\") " Apr 23 18:16:31.828323 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.828299 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-bundle" (OuterVolumeSpecName: "bundle") pod "955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" (UID: "955a9868-3cb7-4d5a-9d7b-95d0d09d0a93"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:16:31.829555 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.829521 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-kube-api-access-chs7k" (OuterVolumeSpecName: "kube-api-access-chs7k") pod "955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" (UID: "955a9868-3cb7-4d5a-9d7b-95d0d09d0a93"). InnerVolumeSpecName "kube-api-access-chs7k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:16:31.832475 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.832453 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-util" (OuterVolumeSpecName: "util") pod "955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" (UID: "955a9868-3cb7-4d5a-9d7b-95d0d09d0a93"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:16:31.929141 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.929101 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-chs7k\" (UniqueName: \"kubernetes.io/projected/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-kube-api-access-chs7k\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:16:31.929141 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.929135 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:16:31.929141 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:31.929150 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955a9868-3cb7-4d5a-9d7b-95d0d09d0a93-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:16:32.559698 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:32.559656 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" event={"ID":"955a9868-3cb7-4d5a-9d7b-95d0d09d0a93","Type":"ContainerDied","Data":"b717769c3eb2a326d2b97fb1f909e836b190084d2dc3f5180e28e5a241a6af66"} Apr 23 18:16:32.559698 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:32.559699 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b717769c3eb2a326d2b97fb1f909e836b190084d2dc3f5180e28e5a241a6af66" Apr 23 18:16:32.559925 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:32.559674 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddc9j7" Apr 23 18:16:32.639699 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:32.639590 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:16:32.658759 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:32.641409 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:16:32.658759 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:32.643412 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:16:32.658759 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:32.645577 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:16:48.077145 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.077104 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742"] Apr 23 18:16:48.077612 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.077432 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerName="pull" Apr 23 18:16:48.077612 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.077444 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerName="pull" Apr 23 18:16:48.077612 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.077461 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerName="util" Apr 23 18:16:48.077612 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.077467 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerName="util" Apr 23 18:16:48.077612 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.077475 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerName="extract" Apr 23 18:16:48.077612 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.077481 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerName="extract" Apr 23 18:16:48.077612 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.077530 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="955a9868-3cb7-4d5a-9d7b-95d0d09d0a93" containerName="extract" Apr 23 18:16:48.084059 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.084039 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.086965 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.086945 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 18:16:48.087043 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.086948 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 18:16:48.088166 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.088149 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b7cnz\"" Apr 23 18:16:48.090084 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.090061 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742"] Apr 23 18:16:48.162267 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.162230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.162461 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.162293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.162461 
ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.162343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87tbd\" (UniqueName: \"kubernetes.io/projected/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-kube-api-access-87tbd\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.262801 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.262747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.262996 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.262837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.262996 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.262893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87tbd\" (UniqueName: \"kubernetes.io/projected/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-kube-api-access-87tbd\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.263163 ip-10-0-138-68 
kubenswrapper[2576]: I0423 18:16:48.263133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.263224 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.263166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.273781 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.273757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87tbd\" (UniqueName: \"kubernetes.io/projected/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-kube-api-access-87tbd\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.395064 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.394974 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:48.517169 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.517136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742"] Apr 23 18:16:48.520052 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:16:48.520023 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c47cd8b_bc3e_4dff_b039_b7c0e74b622b.slice/crio-f5192f88fb66ce09aad292b57540fb1bf009b1d3542f34c502c58eb35e5d9268 WatchSource:0}: Error finding container f5192f88fb66ce09aad292b57540fb1bf009b1d3542f34c502c58eb35e5d9268: Status 404 returned error can't find the container with id f5192f88fb66ce09aad292b57540fb1bf009b1d3542f34c502c58eb35e5d9268 Apr 23 18:16:48.616599 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.616564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" event={"ID":"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b","Type":"ContainerStarted","Data":"7da7e4e2ca5fe051ff22cf42aaa153463dd4ce5c07799245655c5497221f541a"} Apr 23 18:16:48.616599 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:48.616600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" event={"ID":"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b","Type":"ContainerStarted","Data":"f5192f88fb66ce09aad292b57540fb1bf009b1d3542f34c502c58eb35e5d9268"} Apr 23 18:16:49.621445 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:49.621409 2576 generic.go:358] "Generic (PLEG): container finished" podID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerID="7da7e4e2ca5fe051ff22cf42aaa153463dd4ce5c07799245655c5497221f541a" exitCode=0 Apr 23 18:16:49.621445 ip-10-0-138-68 kubenswrapper[2576]: 
I0423 18:16:49.621445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" event={"ID":"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b","Type":"ContainerDied","Data":"7da7e4e2ca5fe051ff22cf42aaa153463dd4ce5c07799245655c5497221f541a"} Apr 23 18:16:52.633446 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:52.633404 2576 generic.go:358] "Generic (PLEG): container finished" podID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerID="50c0d4aabbb7746211484cb2d1ee9d7cd325ece705aa17bb9fd25c487bc75d7e" exitCode=0 Apr 23 18:16:52.633841 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:52.633467 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" event={"ID":"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b","Type":"ContainerDied","Data":"50c0d4aabbb7746211484cb2d1ee9d7cd325ece705aa17bb9fd25c487bc75d7e"} Apr 23 18:16:53.638268 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:53.638228 2576 generic.go:358] "Generic (PLEG): container finished" podID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerID="453214e0f5003eb5fd866af11c42de50dce5bb4a11d810cf2c54f7e2956e6794" exitCode=0 Apr 23 18:16:53.638772 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:53.638279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" event={"ID":"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b","Type":"ContainerDied","Data":"453214e0f5003eb5fd866af11c42de50dce5bb4a11d810cf2c54f7e2956e6794"} Apr 23 18:16:54.762574 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.762549 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:54.819994 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.819957 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-bundle\") pod \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " Apr 23 18:16:54.819994 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.819998 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-util\") pod \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " Apr 23 18:16:54.820201 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.820092 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87tbd\" (UniqueName: \"kubernetes.io/projected/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-kube-api-access-87tbd\") pod \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\" (UID: \"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b\") " Apr 23 18:16:54.820396 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.820377 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-bundle" (OuterVolumeSpecName: "bundle") pod "2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" (UID: "2c47cd8b-bc3e-4dff-b039-b7c0e74b622b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:16:54.822222 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.822193 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-kube-api-access-87tbd" (OuterVolumeSpecName: "kube-api-access-87tbd") pod "2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" (UID: "2c47cd8b-bc3e-4dff-b039-b7c0e74b622b"). InnerVolumeSpecName "kube-api-access-87tbd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:16:54.824615 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.824592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-util" (OuterVolumeSpecName: "util") pod "2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" (UID: "2c47cd8b-bc3e-4dff-b039-b7c0e74b622b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:16:54.920832 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.920713 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:16:54.920832 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.920769 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:16:54.920832 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:54.920778 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87tbd\" (UniqueName: \"kubernetes.io/projected/2c47cd8b-bc3e-4dff-b039-b7c0e74b622b-kube-api-access-87tbd\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:16:55.646660 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:55.646628 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" event={"ID":"2c47cd8b-bc3e-4dff-b039-b7c0e74b622b","Type":"ContainerDied","Data":"f5192f88fb66ce09aad292b57540fb1bf009b1d3542f34c502c58eb35e5d9268"} Apr 23 18:16:55.646660 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:55.646655 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7k742" Apr 23 18:16:55.646869 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:16:55.646658 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5192f88fb66ce09aad292b57540fb1bf009b1d3542f34c502c58eb35e5d9268" Apr 23 18:17:03.085070 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.085037 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-557gz"] Apr 23 18:17:03.085425 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.085335 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerName="extract" Apr 23 18:17:03.085425 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.085346 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerName="extract" Apr 23 18:17:03.085425 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.085355 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerName="pull" Apr 23 18:17:03.085425 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.085360 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerName="pull" Apr 23 18:17:03.085425 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.085375 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" 
containerName="util" Apr 23 18:17:03.085425 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.085380 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerName="util" Apr 23 18:17:03.085662 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.085430 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c47cd8b-bc3e-4dff-b039-b7c0e74b622b" containerName="extract" Apr 23 18:17:03.087383 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.087365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-557gz" Apr 23 18:17:03.090438 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.090413 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 18:17:03.090556 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.090458 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 18:17:03.091483 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.091469 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-nw5jp\"" Apr 23 18:17:03.099597 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.099574 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-557gz"] Apr 23 18:17:03.191226 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.191189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vrvw\" (UniqueName: \"kubernetes.io/projected/d5216e89-da37-4356-bd6f-3c822ec096a9-kube-api-access-5vrvw\") pod \"cert-manager-79c8d999ff-557gz\" (UID: \"d5216e89-da37-4356-bd6f-3c822ec096a9\") " pod="cert-manager/cert-manager-79c8d999ff-557gz" Apr 23 18:17:03.191389 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.191252 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5216e89-da37-4356-bd6f-3c822ec096a9-bound-sa-token\") pod \"cert-manager-79c8d999ff-557gz\" (UID: \"d5216e89-da37-4356-bd6f-3c822ec096a9\") " pod="cert-manager/cert-manager-79c8d999ff-557gz" Apr 23 18:17:03.292400 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.292356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vrvw\" (UniqueName: \"kubernetes.io/projected/d5216e89-da37-4356-bd6f-3c822ec096a9-kube-api-access-5vrvw\") pod \"cert-manager-79c8d999ff-557gz\" (UID: \"d5216e89-da37-4356-bd6f-3c822ec096a9\") " pod="cert-manager/cert-manager-79c8d999ff-557gz" Apr 23 18:17:03.292603 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.292418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5216e89-da37-4356-bd6f-3c822ec096a9-bound-sa-token\") pod \"cert-manager-79c8d999ff-557gz\" (UID: \"d5216e89-da37-4356-bd6f-3c822ec096a9\") " pod="cert-manager/cert-manager-79c8d999ff-557gz" Apr 23 18:17:03.301207 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.301179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vrvw\" (UniqueName: \"kubernetes.io/projected/d5216e89-da37-4356-bd6f-3c822ec096a9-kube-api-access-5vrvw\") pod \"cert-manager-79c8d999ff-557gz\" (UID: \"d5216e89-da37-4356-bd6f-3c822ec096a9\") " pod="cert-manager/cert-manager-79c8d999ff-557gz" Apr 23 18:17:03.301207 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.301191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5216e89-da37-4356-bd6f-3c822ec096a9-bound-sa-token\") pod \"cert-manager-79c8d999ff-557gz\" (UID: \"d5216e89-da37-4356-bd6f-3c822ec096a9\") " 
pod="cert-manager/cert-manager-79c8d999ff-557gz" Apr 23 18:17:03.395760 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.395642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-557gz" Apr 23 18:17:03.516790 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.516759 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-557gz"] Apr 23 18:17:03.519475 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:03.519446 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5216e89_da37_4356_bd6f_3c822ec096a9.slice/crio-d0cee6924fe92f7a7425db6793642d1b17390d4cfea820a987f37bed3081d4a9 WatchSource:0}: Error finding container d0cee6924fe92f7a7425db6793642d1b17390d4cfea820a987f37bed3081d4a9: Status 404 returned error can't find the container with id d0cee6924fe92f7a7425db6793642d1b17390d4cfea820a987f37bed3081d4a9 Apr 23 18:17:03.674230 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:03.674146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-557gz" event={"ID":"d5216e89-da37-4356-bd6f-3c822ec096a9","Type":"ContainerStarted","Data":"d0cee6924fe92f7a7425db6793642d1b17390d4cfea820a987f37bed3081d4a9"} Apr 23 18:17:06.685671 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:06.685629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-557gz" event={"ID":"d5216e89-da37-4356-bd6f-3c822ec096a9","Type":"ContainerStarted","Data":"75abe1b0605064092b31481ad3ffd62e3bdd1e4f27b845b19df7b4b2f6e5dfaa"} Apr 23 18:17:06.704294 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:06.704243 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-557gz" podStartSLOduration=1.286388952 podStartE2EDuration="3.704228039s" podCreationTimestamp="2026-04-23 18:17:03 +0000 UTC" 
firstStartedPulling="2026-04-23 18:17:03.521235255 +0000 UTC m=+2131.451247868" lastFinishedPulling="2026-04-23 18:17:05.939074337 +0000 UTC m=+2133.869086955" observedRunningTime="2026-04-23 18:17:06.702301961 +0000 UTC m=+2134.632314597" watchObservedRunningTime="2026-04-23 18:17:06.704228039 +0000 UTC m=+2134.634240675" Apr 23 18:17:15.715585 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.715505 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9"] Apr 23 18:17:15.719158 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.719139 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.722211 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.722175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b7cnz\"" Apr 23 18:17:15.722375 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.722318 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 18:17:15.723327 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.723311 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 18:17:15.729692 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.729670 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9"] Apr 23 18:17:15.801030 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.800989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-bundle\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.801030 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.801032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.801281 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.801065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f286\" (UniqueName: \"kubernetes.io/projected/0f37d7ba-71a4-4468-820b-c02f169a2faf-kube-api-access-8f286\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.901826 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.901783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.901826 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.901828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-util\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.902031 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.901849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f286\" (UniqueName: \"kubernetes.io/projected/0f37d7ba-71a4-4468-820b-c02f169a2faf-kube-api-access-8f286\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.902217 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.902197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.902256 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.902207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:15.912319 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:15.912279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f286\" (UniqueName: \"kubernetes.io/projected/0f37d7ba-71a4-4468-820b-c02f169a2faf-kube-api-access-8f286\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9\" (UID: 
\"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:16.029034 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:16.028959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:16.155361 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:16.155336 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9"] Apr 23 18:17:16.158112 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:16.158074 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f37d7ba_71a4_4468_820b_c02f169a2faf.slice/crio-a4cfbe2922b7f080493b1778bc7175a0667e7eddf9dcafaaa39b566a1429593a WatchSource:0}: Error finding container a4cfbe2922b7f080493b1778bc7175a0667e7eddf9dcafaaa39b566a1429593a: Status 404 returned error can't find the container with id a4cfbe2922b7f080493b1778bc7175a0667e7eddf9dcafaaa39b566a1429593a Apr 23 18:17:16.720027 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:16.719990 2576 generic.go:358] "Generic (PLEG): container finished" podID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerID="c937d0b543abe991c84a21650f1a01bf0166aea591495582964215256ef675b0" exitCode=0 Apr 23 18:17:16.720510 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:16.720076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" event={"ID":"0f37d7ba-71a4-4468-820b-c02f169a2faf","Type":"ContainerDied","Data":"c937d0b543abe991c84a21650f1a01bf0166aea591495582964215256ef675b0"} Apr 23 18:17:16.720510 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:16.720109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" event={"ID":"0f37d7ba-71a4-4468-820b-c02f169a2faf","Type":"ContainerStarted","Data":"a4cfbe2922b7f080493b1778bc7175a0667e7eddf9dcafaaa39b566a1429593a"} Apr 23 18:17:17.725296 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:17.725265 2576 generic.go:358] "Generic (PLEG): container finished" podID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerID="a864ba3d498be6851fa864da9f0eb4401d136c23631950b0eee0c58e6084c7a5" exitCode=0 Apr 23 18:17:17.725686 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:17.725342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" event={"ID":"0f37d7ba-71a4-4468-820b-c02f169a2faf","Type":"ContainerDied","Data":"a864ba3d498be6851fa864da9f0eb4401d136c23631950b0eee0c58e6084c7a5"} Apr 23 18:17:18.730282 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:18.730245 2576 generic.go:358] "Generic (PLEG): container finished" podID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerID="1c64bd1cb2a47a05e84676b9e9ee56f3fbc066f5704a4922569cc014a0407f74" exitCode=0 Apr 23 18:17:18.730703 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:18.730327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" event={"ID":"0f37d7ba-71a4-4468-820b-c02f169a2faf","Type":"ContainerDied","Data":"1c64bd1cb2a47a05e84676b9e9ee56f3fbc066f5704a4922569cc014a0407f74"} Apr 23 18:17:19.852314 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:19.852291 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:19.936634 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:19.936597 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f286\" (UniqueName: \"kubernetes.io/projected/0f37d7ba-71a4-4468-820b-c02f169a2faf-kube-api-access-8f286\") pod \"0f37d7ba-71a4-4468-820b-c02f169a2faf\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " Apr 23 18:17:19.936820 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:19.936653 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-util\") pod \"0f37d7ba-71a4-4468-820b-c02f169a2faf\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " Apr 23 18:17:19.936899 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:19.936834 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-bundle\") pod \"0f37d7ba-71a4-4468-820b-c02f169a2faf\" (UID: \"0f37d7ba-71a4-4468-820b-c02f169a2faf\") " Apr 23 18:17:19.937656 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:19.937628 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-bundle" (OuterVolumeSpecName: "bundle") pod "0f37d7ba-71a4-4468-820b-c02f169a2faf" (UID: "0f37d7ba-71a4-4468-820b-c02f169a2faf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:17:19.938743 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:19.938709 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f37d7ba-71a4-4468-820b-c02f169a2faf-kube-api-access-8f286" (OuterVolumeSpecName: "kube-api-access-8f286") pod "0f37d7ba-71a4-4468-820b-c02f169a2faf" (UID: "0f37d7ba-71a4-4468-820b-c02f169a2faf"). InnerVolumeSpecName "kube-api-access-8f286". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:17:19.942080 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:19.942059 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-util" (OuterVolumeSpecName: "util") pod "0f37d7ba-71a4-4468-820b-c02f169a2faf" (UID: "0f37d7ba-71a4-4468-820b-c02f169a2faf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:17:20.037651 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:20.037554 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:17:20.037651 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:20.037590 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8f286\" (UniqueName: \"kubernetes.io/projected/0f37d7ba-71a4-4468-820b-c02f169a2faf-kube-api-access-8f286\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:17:20.037651 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:20.037600 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f37d7ba-71a4-4468-820b-c02f169a2faf-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:17:20.739322 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:20.739289 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" event={"ID":"0f37d7ba-71a4-4468-820b-c02f169a2faf","Type":"ContainerDied","Data":"a4cfbe2922b7f080493b1778bc7175a0667e7eddf9dcafaaa39b566a1429593a"} Apr 23 18:17:20.739322 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:20.739322 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835ld8c9" Apr 23 18:17:20.739501 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:20.739325 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4cfbe2922b7f080493b1778bc7175a0667e7eddf9dcafaaa39b566a1429593a" Apr 23 18:17:30.713484 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.713447 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8"] Apr 23 18:17:30.713956 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.713770 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerName="util" Apr 23 18:17:30.713956 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.713784 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerName="util" Apr 23 18:17:30.713956 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.713798 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerName="pull" Apr 23 18:17:30.713956 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.713803 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerName="pull" Apr 23 18:17:30.713956 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.713811 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerName="extract" Apr 23 18:17:30.713956 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.713817 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerName="extract" Apr 23 18:17:30.713956 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.713861 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f37d7ba-71a4-4468-820b-c02f169a2faf" containerName="extract" Apr 23 18:17:30.721279 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.721261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.725115 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.725093 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 18:17:30.725257 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.725191 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 18:17:30.726453 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.726436 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b7cnz\"" Apr 23 18:17:30.730066 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.730044 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8"] Apr 23 18:17:30.826480 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.826442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg4js\" (UniqueName: \"kubernetes.io/projected/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-kube-api-access-kg4js\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: 
\"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.826663 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.826545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.826663 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.826613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.928104 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.928068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kg4js\" (UniqueName: \"kubernetes.io/projected/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-kube-api-access-kg4js\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.928104 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.928119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.928358 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.928157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.928574 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.928553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.928642 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.928574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:30.938991 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:30.938961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg4js\" (UniqueName: \"kubernetes.io/projected/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-kube-api-access-kg4js\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 
18:17:31.031417 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:31.031330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:31.154557 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:31.154498 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8"] Apr 23 18:17:31.157014 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:31.156985 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d7336c3_a2e4_4e0f_8754_5cd0ec008d9d.slice/crio-1dba8e61542e2497268a34abcc3182086e1e303ae212b607b204157a9d950f9a WatchSource:0}: Error finding container 1dba8e61542e2497268a34abcc3182086e1e303ae212b607b204157a9d950f9a: Status 404 returned error can't find the container with id 1dba8e61542e2497268a34abcc3182086e1e303ae212b607b204157a9d950f9a Apr 23 18:17:31.777395 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:31.777358 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerID="d8f7422f9eb3bc170176b897a52525e8779066eabb55c3088050625a12052e4b" exitCode=0 Apr 23 18:17:31.777805 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:31.777401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" event={"ID":"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d","Type":"ContainerDied","Data":"d8f7422f9eb3bc170176b897a52525e8779066eabb55c3088050625a12052e4b"} Apr 23 18:17:31.777805 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:31.777427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" 
event={"ID":"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d","Type":"ContainerStarted","Data":"1dba8e61542e2497268a34abcc3182086e1e303ae212b607b204157a9d950f9a"} Apr 23 18:17:32.781898 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:32.781862 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerID="85720d93a053367fe8cc9187293ad62d93a2195b5a9d62a83460d39c275c7f84" exitCode=0 Apr 23 18:17:32.782267 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:32.781906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" event={"ID":"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d","Type":"ContainerDied","Data":"85720d93a053367fe8cc9187293ad62d93a2195b5a9d62a83460d39c275c7f84"} Apr 23 18:17:32.883256 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:32.883220 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt"] Apr 23 18:17:32.886471 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:32.886455 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:32.890842 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:32.890815 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-m7b8w\"" Apr 23 18:17:32.891176 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:32.891153 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 23 18:17:32.891176 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:32.891169 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 23 18:17:32.906374 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:32.906346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt"] Apr 23 18:17:33.046153 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.046062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xzv\" (UniqueName: \"kubernetes.io/projected/0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc-kube-api-access-v6xzv\") pod \"servicemesh-operator3-55f49c5f94-cl6rt\" (UID: \"0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:33.046297 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.046181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cl6rt\" (UID: \"0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:33.147500 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.147467 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v6xzv\" (UniqueName: \"kubernetes.io/projected/0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc-kube-api-access-v6xzv\") pod \"servicemesh-operator3-55f49c5f94-cl6rt\" (UID: \"0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:33.147684 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.147557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cl6rt\" (UID: \"0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:33.149985 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.149965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cl6rt\" (UID: \"0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:33.156202 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.156180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6xzv\" (UniqueName: \"kubernetes.io/projected/0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc-kube-api-access-v6xzv\") pod \"servicemesh-operator3-55f49c5f94-cl6rt\" (UID: \"0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:33.196932 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.196907 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:33.324949 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.324925 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt"] Apr 23 18:17:33.327634 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:33.327603 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d9d8b62_38e2_4bdf_bbdb_4d9119bc67dc.slice/crio-7e51987a1883757276e0152f1efc177784bb3a401d45468759a915259ed4d422 WatchSource:0}: Error finding container 7e51987a1883757276e0152f1efc177784bb3a401d45468759a915259ed4d422: Status 404 returned error can't find the container with id 7e51987a1883757276e0152f1efc177784bb3a401d45468759a915259ed4d422 Apr 23 18:17:33.786265 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.786165 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" event={"ID":"0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc","Type":"ContainerStarted","Data":"7e51987a1883757276e0152f1efc177784bb3a401d45468759a915259ed4d422"} Apr 23 18:17:33.788017 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.787993 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerID="af87f6638d6cb80229870bb0463fa6e05b5ffd9acb9f0a83040ad2f9d051ae67" exitCode=0 Apr 23 18:17:33.788134 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:33.788082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" event={"ID":"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d","Type":"ContainerDied","Data":"af87f6638d6cb80229870bb0463fa6e05b5ffd9acb9f0a83040ad2f9d051ae67"} Apr 23 18:17:34.929812 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:34.929782 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:34.962787 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:34.962753 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-util\") pod \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " Apr 23 18:17:34.962923 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:34.962803 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg4js\" (UniqueName: \"kubernetes.io/projected/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-kube-api-access-kg4js\") pod \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " Apr 23 18:17:34.962923 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:34.962829 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-bundle\") pod \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\" (UID: \"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d\") " Apr 23 18:17:34.964291 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:34.963987 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-bundle" (OuterVolumeSpecName: "bundle") pod "2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" (UID: "2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:17:34.966769 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:34.966628 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-kube-api-access-kg4js" (OuterVolumeSpecName: "kube-api-access-kg4js") pod "2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" (UID: "2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d"). InnerVolumeSpecName "kube-api-access-kg4js". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:17:34.969677 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:34.969636 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-util" (OuterVolumeSpecName: "util") pod "2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" (UID: "2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:17:35.063993 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:35.063959 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:17:35.063993 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:35.063989 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kg4js\" (UniqueName: \"kubernetes.io/projected/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-kube-api-access-kg4js\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:17:35.063993 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:35.064003 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:17:35.798484 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:35.798452 2576 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" Apr 23 18:17:35.798653 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:35.798453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebj6fz8" event={"ID":"2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d","Type":"ContainerDied","Data":"1dba8e61542e2497268a34abcc3182086e1e303ae212b607b204157a9d950f9a"} Apr 23 18:17:35.798653 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:35.798568 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dba8e61542e2497268a34abcc3182086e1e303ae212b607b204157a9d950f9a" Apr 23 18:17:36.803787 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:36.803753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" event={"ID":"0d9d8b62-38e2-4bdf-bbdb-4d9119bc67dc","Type":"ContainerStarted","Data":"bb9dfba56c2905515576a99159f51d6f7698917e49553aabaa20e27e0446c363"} Apr 23 18:17:36.804174 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:36.803831 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:36.827605 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:36.827544 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" podStartSLOduration=2.231197086 podStartE2EDuration="4.827523787s" podCreationTimestamp="2026-04-23 18:17:32 +0000 UTC" firstStartedPulling="2026-04-23 18:17:33.330028091 +0000 UTC m=+2161.260040705" lastFinishedPulling="2026-04-23 18:17:35.926354777 +0000 UTC m=+2163.856367406" observedRunningTime="2026-04-23 18:17:36.825050691 +0000 UTC m=+2164.755063328" watchObservedRunningTime="2026-04-23 18:17:36.827523787 +0000 UTC m=+2164.757536424" Apr 23 
18:17:40.454362 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.454322 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj"] Apr 23 18:17:40.454821 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.454698 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerName="util" Apr 23 18:17:40.454821 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.454710 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerName="util" Apr 23 18:17:40.454821 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.454737 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerName="pull" Apr 23 18:17:40.454821 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.454742 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerName="pull" Apr 23 18:17:40.454821 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.454752 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerName="extract" Apr 23 18:17:40.454821 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.454758 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerName="extract" Apr 23 18:17:40.454821 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.454811 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d7336c3-a2e4-4e0f-8754-5cd0ec008d9d" containerName="extract" Apr 23 18:17:40.458315 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.458297 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.461305 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.461281 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 18:17:40.461542 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.461309 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 23 18:17:40.461652 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.461308 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 23 18:17:40.461728 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.461335 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 23 18:17:40.461788 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.461351 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-xvnp5\"" Apr 23 18:17:40.461788 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.461353 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 18:17:40.461851 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.461430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 23 18:17:40.469439 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.469411 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj"] Apr 23 18:17:40.512608 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.512570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jmrg\" (UniqueName: 
\"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-kube-api-access-4jmrg\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.512783 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.512625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.512783 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.512685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.512783 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.512743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8bded537-e03b-4ef3-8d6a-a5db985fdedb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.512914 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.512788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: 
\"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.512914 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.512809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.512914 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.512843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.614157 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.614115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.614157 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.614165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8bded537-e03b-4ef3-8d6a-a5db985fdedb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.614472 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.614224 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.614472 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.614251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.614472 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.614298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.614472 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.614341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jmrg\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-kube-api-access-4jmrg\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.614472 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.614401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-dns-cert\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.615153 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.615081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.616960 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.616929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.617156 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.617139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.617199 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.617157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8bded537-e03b-4ef3-8d6a-a5db985fdedb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.617264 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.617243 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.626426 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.626395 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.627082 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.627059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jmrg\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-kube-api-access-4jmrg\") pod \"istiod-openshift-gateway-7cd77c7ffd-h4tbj\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.768429 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.768328 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:40.911580 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:40.911548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj"] Apr 23 18:17:40.914200 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:40.914169 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bded537_e03b_4ef3_8d6a_a5db985fdedb.slice/crio-fe14fa4beaac4055a4fef9a5168543df728711cd11253546ee9ec2ae730bd4ce WatchSource:0}: Error finding container fe14fa4beaac4055a4fef9a5168543df728711cd11253546ee9ec2ae730bd4ce: Status 404 returned error can't find the container with id fe14fa4beaac4055a4fef9a5168543df728711cd11253546ee9ec2ae730bd4ce Apr 23 18:17:41.826065 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:41.826017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" event={"ID":"8bded537-e03b-4ef3-8d6a-a5db985fdedb","Type":"ContainerStarted","Data":"fe14fa4beaac4055a4fef9a5168543df728711cd11253546ee9ec2ae730bd4ce"} Apr 23 18:17:43.243173 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:43.243118 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:17:43.243441 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:43.243216 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:17:43.837537 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:43.837492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" 
event={"ID":"8bded537-e03b-4ef3-8d6a-a5db985fdedb","Type":"ContainerStarted","Data":"38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189"} Apr 23 18:17:43.837728 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:43.837567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:43.865913 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:43.865839 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" podStartSLOduration=1.5391853869999998 podStartE2EDuration="3.865819851s" podCreationTimestamp="2026-04-23 18:17:40 +0000 UTC" firstStartedPulling="2026-04-23 18:17:40.916212728 +0000 UTC m=+2168.846225346" lastFinishedPulling="2026-04-23 18:17:43.24284718 +0000 UTC m=+2171.172859810" observedRunningTime="2026-04-23 18:17:43.86337279 +0000 UTC m=+2171.793385442" watchObservedRunningTime="2026-04-23 18:17:43.865819851 +0000 UTC m=+2171.795832491" Apr 23 18:17:44.842961 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:44.842927 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-h4tbj container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 23 18:17:44.843354 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:44.842979 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" podUID="8bded537-e03b-4ef3-8d6a-a5db985fdedb" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:17:47.810133 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:47.810101 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cl6rt" Apr 23 18:17:47.842776 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:17:47.842744 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:17:48.261750 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.261650 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2"] Apr 23 18:17:48.265211 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.265184 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.268091 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.268062 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-d6v9w\"" Apr 23 18:17:48.287124 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.287082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2"] Apr 23 18:17:48.385986 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.385945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.386181 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.386012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.386181 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.386119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6npc\" (UniqueName: \"kubernetes.io/projected/f2afacd4-04d7-4d25-aa31-0e507e733b70-kube-api-access-g6npc\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.386313 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.386177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.386313 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.386203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.386313 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.386234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.386313 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.386260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f2afacd4-04d7-4d25-aa31-0e507e733b70-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.386313 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.386293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.386526 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.386370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.487652 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.487610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.487862 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.487659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.487862 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.487701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6npc\" (UniqueName: \"kubernetes.io/projected/f2afacd4-04d7-4d25-aa31-0e507e733b70-kube-api-access-g6npc\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.487862 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.487753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488033 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.487890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 
18:17:48.488033 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.487944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488033 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.487975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f2afacd4-04d7-4d25-aa31-0e507e733b70-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488033 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.488012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488238 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.488130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-credential-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488238 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.488140 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488238 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.488175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-workload-socket\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488473 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.488448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-data\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488603 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.488465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-workload-certs\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.488703 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.488676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/f2afacd4-04d7-4d25-aa31-0e507e733b70-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.490370 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.490350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-envoy\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.490590 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.490567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.497517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.497491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f2afacd4-04d7-4d25-aa31-0e507e733b70-istio-token\") pod \"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.497662 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.497604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6npc\" (UniqueName: \"kubernetes.io/projected/f2afacd4-04d7-4d25-aa31-0e507e733b70-kube-api-access-g6npc\") pod 
\"openshift-ai-inference-openshift-default-6b94bb86d8-j64f2\" (UID: \"f2afacd4-04d7-4d25-aa31-0e507e733b70\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.587219 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.587182 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:48.727208 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.727177 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2"] Apr 23 18:17:48.729915 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:48.729882 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2afacd4_04d7_4d25_aa31_0e507e733b70.slice/crio-1c974ac499e991f39d968023a3838face6b0d74f0e290905bcda644f3699707d WatchSource:0}: Error finding container 1c974ac499e991f39d968023a3838face6b0d74f0e290905bcda644f3699707d: Status 404 returned error can't find the container with id 1c974ac499e991f39d968023a3838face6b0d74f0e290905bcda644f3699707d Apr 23 18:17:48.855504 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:48.855416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" event={"ID":"f2afacd4-04d7-4d25-aa31-0e507e733b70","Type":"ContainerStarted","Data":"1c974ac499e991f39d968023a3838face6b0d74f0e290905bcda644f3699707d"} Apr 23 18:17:51.211352 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:51.211310 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:17:51.211755 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:51.211386 2576 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:17:51.211755 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:51.211414 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:17:51.869684 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:51.869642 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" event={"ID":"f2afacd4-04d7-4d25-aa31-0e507e733b70","Type":"ContainerStarted","Data":"bc042812dc3d9069220e5d2d151d36a0af41172830eefbfdbecef867a8ad1cae"} Apr 23 18:17:51.896002 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:51.895955 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" podStartSLOduration=1.416804442 podStartE2EDuration="3.895939068s" podCreationTimestamp="2026-04-23 18:17:48 +0000 UTC" firstStartedPulling="2026-04-23 18:17:48.731911756 +0000 UTC m=+2176.661924388" lastFinishedPulling="2026-04-23 18:17:51.2110464 +0000 UTC m=+2179.141059014" observedRunningTime="2026-04-23 18:17:51.892307425 +0000 UTC m=+2179.822320060" watchObservedRunningTime="2026-04-23 18:17:51.895939068 +0000 UTC m=+2179.825951704" Apr 23 18:17:52.587579 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:52.587534 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:52.592495 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:52.592471 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:52.873503 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:17:52.873410 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:52.874378 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:52.874356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-6b94bb86d8-j64f2" Apr 23 18:17:55.984508 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:55.984470 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"] Apr 23 18:17:55.988057 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:55.988038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" Apr 23 18:17:55.991243 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:55.991220 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 18:17:55.991373 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:55.991251 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 18:17:55.992657 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:55.992642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b7cnz\"" Apr 23 18:17:56.002501 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.000750 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"] Apr 23 18:17:56.058780 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.058739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" Apr 23 18:17:56.058954 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.058822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfvm\" (UniqueName: \"kubernetes.io/projected/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-kube-api-access-7qfvm\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" Apr 23 18:17:56.058954 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.058904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" Apr 23 18:17:56.087987 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.087954 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"] Apr 23 18:17:56.091535 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.091513 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.100847 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.100821 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"]
Apr 23 18:17:56.159565 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.159521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"
Apr 23 18:17:56.159783 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.159589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"
Apr 23 18:17:56.159783 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.159617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.159783 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.159646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfvm\" (UniqueName: \"kubernetes.io/projected/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-kube-api-access-7qfvm\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"
Apr 23 18:17:56.159783 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.159671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtzt\" (UniqueName: \"kubernetes.io/projected/0ced8e25-b230-4af8-84e9-bfdad496ff96-kube-api-access-mbtzt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.159982 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.159806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.159982 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.159966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"
Apr 23 18:17:56.160044 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.160007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"
Apr 23 18:17:56.169374 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.169346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfvm\" (UniqueName: \"kubernetes.io/projected/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-kube-api-access-7qfvm\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"
Apr 23 18:17:56.198701 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.198665 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"]
Apr 23 18:17:56.202473 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.202459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.213305 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.213281 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"]
Apr 23 18:17:56.261266 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.261179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.261266 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.261220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.261266 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.261240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jfg7\" (UniqueName: \"kubernetes.io/projected/a8993fa0-0c64-46e7-ae72-46b8769c2a85-kube-api-access-9jfg7\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.261567 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.261273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtzt\" (UniqueName: \"kubernetes.io/projected/0ced8e25-b230-4af8-84e9-bfdad496ff96-kube-api-access-mbtzt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.261567 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.261344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.261567 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.261404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.261662 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.261574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.261696 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.261662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.272327 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.272304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtzt\" (UniqueName: \"kubernetes.io/projected/0ced8e25-b230-4af8-84e9-bfdad496ff96-kube-api-access-mbtzt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.288439 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.288404 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"]
Apr 23 18:17:56.292072 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.292053 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.299693 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.299665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"
Apr 23 18:17:56.302778 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.302757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"]
Apr 23 18:17:56.362804 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.362765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.362991 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.362816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jfg7\" (UniqueName: \"kubernetes.io/projected/a8993fa0-0c64-46e7-ae72-46b8769c2a85-kube-api-access-9jfg7\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.362991 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.362848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvsqv\" (UniqueName: \"kubernetes.io/projected/c5c6717e-7c30-473f-84fc-0e0299571336-kube-api-access-cvsqv\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.362991 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.362902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.362991 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.362929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.363217 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.363004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.363618 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.363515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.363618 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.363599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.374674 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.374648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jfg7\" (UniqueName: \"kubernetes.io/projected/a8993fa0-0c64-46e7-ae72-46b8769c2a85-kube-api-access-9jfg7\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.402054 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.402021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:17:56.436575 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.436544 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"]
Apr 23 18:17:56.437971 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:56.437942 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod402ddad4_693f_4d7d_8a1b_dd7e80ecb103.slice/crio-2d587e1a3cdca679080f69768ca6b406051ea2eff64cf8ea7fe3351e541cb31b WatchSource:0}: Error finding container 2d587e1a3cdca679080f69768ca6b406051ea2eff64cf8ea7fe3351e541cb31b: Status 404 returned error can't find the container with id 2d587e1a3cdca679080f69768ca6b406051ea2eff64cf8ea7fe3351e541cb31b
Apr 23 18:17:56.464849 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.464430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvsqv\" (UniqueName: \"kubernetes.io/projected/c5c6717e-7c30-473f-84fc-0e0299571336-kube-api-access-cvsqv\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.464849 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.464494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.464849 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.464524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.465082 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.464897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.465160 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.465140 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.475916 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.475888 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvsqv\" (UniqueName: \"kubernetes.io/projected/c5c6717e-7c30-473f-84fc-0e0299571336-kube-api-access-cvsqv\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.512598 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.512521 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:17:56.538250 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.538207 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"]
Apr 23 18:17:56.540921 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:56.540829 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ced8e25_b230_4af8_84e9_bfdad496ff96.slice/crio-6decfdd968ec16a206804ee2c31e5738a09f11368fa97595f405a30355849d38 WatchSource:0}: Error finding container 6decfdd968ec16a206804ee2c31e5738a09f11368fa97595f405a30355849d38: Status 404 returned error can't find the container with id 6decfdd968ec16a206804ee2c31e5738a09f11368fa97595f405a30355849d38
Apr 23 18:17:56.602560 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.602531 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:17:56.649855 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:56.649824 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8993fa0_0c64_46e7_ae72_46b8769c2a85.slice/crio-6f282709a1293780d4006f0a34e3691ea4ebb4d70b46dc808cd01f4df903cb47 WatchSource:0}: Error finding container 6f282709a1293780d4006f0a34e3691ea4ebb4d70b46dc808cd01f4df903cb47: Status 404 returned error can't find the container with id 6f282709a1293780d4006f0a34e3691ea4ebb4d70b46dc808cd01f4df903cb47
Apr 23 18:17:56.651300 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.651262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"]
Apr 23 18:17:56.742365 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.742342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"]
Apr 23 18:17:56.766329 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:17:56.766302 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c6717e_7c30_473f_84fc_0e0299571336.slice/crio-166f6954ed08fd4ec5f63d7bbfaa8d2046f041a5e98c967773e7c1bc246ae1e6 WatchSource:0}: Error finding container 166f6954ed08fd4ec5f63d7bbfaa8d2046f041a5e98c967773e7c1bc246ae1e6: Status 404 returned error can't find the container with id 166f6954ed08fd4ec5f63d7bbfaa8d2046f041a5e98c967773e7c1bc246ae1e6
Apr 23 18:17:56.888048 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.888017 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerID="35a87a909657a0d519d4711db5c4f593000a8be5f972486cc2e65e37c54ad638" exitCode=0
Apr 23 18:17:56.888187 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.888096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt" event={"ID":"a8993fa0-0c64-46e7-ae72-46b8769c2a85","Type":"ContainerDied","Data":"35a87a909657a0d519d4711db5c4f593000a8be5f972486cc2e65e37c54ad638"}
Apr 23 18:17:56.888187 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.888137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt" event={"ID":"a8993fa0-0c64-46e7-ae72-46b8769c2a85","Type":"ContainerStarted","Data":"6f282709a1293780d4006f0a34e3691ea4ebb4d70b46dc808cd01f4df903cb47"}
Apr 23 18:17:56.889399 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.889376 2576 generic.go:358] "Generic (PLEG): container finished" podID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerID="8a561fd40e9ad4fc976199c5396a7acb10075450ca866697e2a6eadf6091cc11" exitCode=0
Apr 23 18:17:56.889613 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.889448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" event={"ID":"402ddad4-693f-4d7d-8a1b-dd7e80ecb103","Type":"ContainerDied","Data":"8a561fd40e9ad4fc976199c5396a7acb10075450ca866697e2a6eadf6091cc11"}
Apr 23 18:17:56.889613 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.889472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" event={"ID":"402ddad4-693f-4d7d-8a1b-dd7e80ecb103","Type":"ContainerStarted","Data":"2d587e1a3cdca679080f69768ca6b406051ea2eff64cf8ea7fe3351e541cb31b"}
Apr 23 18:17:56.890744 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.890706 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c6717e-7c30-473f-84fc-0e0299571336" containerID="3d50d266811fc04ed4f3104979519e28e85d12aae1e8cbab9e5302abca390a94" exitCode=0
Apr 23 18:17:56.890811 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.890746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg" event={"ID":"c5c6717e-7c30-473f-84fc-0e0299571336","Type":"ContainerDied","Data":"3d50d266811fc04ed4f3104979519e28e85d12aae1e8cbab9e5302abca390a94"}
Apr 23 18:17:56.890811 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.890774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg" event={"ID":"c5c6717e-7c30-473f-84fc-0e0299571336","Type":"ContainerStarted","Data":"166f6954ed08fd4ec5f63d7bbfaa8d2046f041a5e98c967773e7c1bc246ae1e6"}
Apr 23 18:17:56.894969 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.893455 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerID="d802bebd685d8545fc1b2222976da8b6920d55ac7e3051c0b639be8421529e69" exitCode=0
Apr 23 18:17:56.894969 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.893506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46" event={"ID":"0ced8e25-b230-4af8-84e9-bfdad496ff96","Type":"ContainerDied","Data":"d802bebd685d8545fc1b2222976da8b6920d55ac7e3051c0b639be8421529e69"}
Apr 23 18:17:56.894969 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:56.893527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46" event={"ID":"0ced8e25-b230-4af8-84e9-bfdad496ff96","Type":"ContainerStarted","Data":"6decfdd968ec16a206804ee2c31e5738a09f11368fa97595f405a30355849d38"}
Apr 23 18:17:57.899870 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:57.899838 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerID="03035d2cce8fa75d222631c93c0f37873731659282a64810c4c9f44f4f6fa394" exitCode=0
Apr 23 18:17:57.900304 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:57.899928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt" event={"ID":"a8993fa0-0c64-46e7-ae72-46b8769c2a85","Type":"ContainerDied","Data":"03035d2cce8fa75d222631c93c0f37873731659282a64810c4c9f44f4f6fa394"}
Apr 23 18:17:57.901745 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:57.901706 2576 generic.go:358] "Generic (PLEG): container finished" podID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerID="63b2c6a6d49943e6eff4a81d8a769cbac706de00b4b953badc2ddc942edcb2f5" exitCode=0
Apr 23 18:17:57.901829 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:57.901746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" event={"ID":"402ddad4-693f-4d7d-8a1b-dd7e80ecb103","Type":"ContainerDied","Data":"63b2c6a6d49943e6eff4a81d8a769cbac706de00b4b953badc2ddc942edcb2f5"}
Apr 23 18:17:57.903445 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:57.903424 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c6717e-7c30-473f-84fc-0e0299571336" containerID="ec8eff7bf41b9a067118c7f94696acda18a15b93cba276e16d237e274c3139d0" exitCode=0
Apr 23 18:17:57.903557 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:57.903516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg" event={"ID":"c5c6717e-7c30-473f-84fc-0e0299571336","Type":"ContainerDied","Data":"ec8eff7bf41b9a067118c7f94696acda18a15b93cba276e16d237e274c3139d0"}
Apr 23 18:17:57.905505 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:57.905472 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerID="baab9ebb3f36e90689c896f1c6ef8bd218d3a8a4bcf41ec11c7ebe4c8abcbaa0" exitCode=0
Apr 23 18:17:57.905576 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:57.905505 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46" event={"ID":"0ced8e25-b230-4af8-84e9-bfdad496ff96","Type":"ContainerDied","Data":"baab9ebb3f36e90689c896f1c6ef8bd218d3a8a4bcf41ec11c7ebe4c8abcbaa0"}
Apr 23 18:17:58.911320 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:58.911282 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerID="d5b9def5ba88652f4146a1163e022ae546dc04645487184f178c878bf0753a35" exitCode=0
Apr 23 18:17:58.911782 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:58.911374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46" event={"ID":"0ced8e25-b230-4af8-84e9-bfdad496ff96","Type":"ContainerDied","Data":"d5b9def5ba88652f4146a1163e022ae546dc04645487184f178c878bf0753a35"}
Apr 23 18:17:58.913142 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:58.913121 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerID="2602d28a888c4e881e4696383e606b6a825f3a05cfd8ce4fbc9807f80140f972" exitCode=0
Apr 23 18:17:58.913257 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:58.913205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt" event={"ID":"a8993fa0-0c64-46e7-ae72-46b8769c2a85","Type":"ContainerDied","Data":"2602d28a888c4e881e4696383e606b6a825f3a05cfd8ce4fbc9807f80140f972"}
Apr 23 18:17:58.914805 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:58.914780 2576 generic.go:358] "Generic (PLEG): container finished" podID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerID="ff2c2d5c0fd62a040c1d324a737d96f2bbedd932ec7d2f022a951e37928eca7b" exitCode=0
Apr 23 18:17:58.914879 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:58.914863 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" event={"ID":"402ddad4-693f-4d7d-8a1b-dd7e80ecb103","Type":"ContainerDied","Data":"ff2c2d5c0fd62a040c1d324a737d96f2bbedd932ec7d2f022a951e37928eca7b"}
Apr 23 18:17:58.916598 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:58.916579 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5c6717e-7c30-473f-84fc-0e0299571336" containerID="5da0aab945a15638f55403a33474f646716676de94b587114489543a9fa14427" exitCode=0
Apr 23 18:17:58.916683 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:17:58.916607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg" event={"ID":"c5c6717e-7c30-473f-84fc-0e0299571336","Type":"ContainerDied","Data":"5da0aab945a15638f55403a33474f646716676de94b587114489543a9fa14427"}
Apr 23 18:18:00.045885 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.045866 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt"
Apr 23 18:18:00.079077 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.079051 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg"
Apr 23 18:18:00.118376 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.118350 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46"
Apr 23 18:18:00.121542 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.121525 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt"
Apr 23 18:18:00.197462 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197379 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvsqv\" (UniqueName: \"kubernetes.io/projected/c5c6717e-7c30-473f-84fc-0e0299571336-kube-api-access-cvsqv\") pod \"c5c6717e-7c30-473f-84fc-0e0299571336\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") "
Apr 23 18:18:00.197462 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197420 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-bundle\") pod \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") "
Apr 23 18:18:00.197680 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197474 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-util\") pod \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") "
Apr 23 18:18:00.197680 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197505 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbtzt\" (UniqueName: \"kubernetes.io/projected/0ced8e25-b230-4af8-84e9-bfdad496ff96-kube-api-access-mbtzt\") pod \"0ced8e25-b230-4af8-84e9-bfdad496ff96\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") "
Apr 23 18:18:00.197680 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197531 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-bundle\") pod \"c5c6717e-7c30-473f-84fc-0e0299571336\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") "
Apr 23 18:18:00.197680 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197554 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-bundle\") pod \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") "
Apr 23 18:18:00.197680 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197601 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jfg7\" (UniqueName: \"kubernetes.io/projected/a8993fa0-0c64-46e7-ae72-46b8769c2a85-kube-api-access-9jfg7\") pod \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") "
Apr 23 18:18:00.197680 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197646 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-bundle\") pod \"0ced8e25-b230-4af8-84e9-bfdad496ff96\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") "
Apr 23 18:18:00.197680 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197677 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qfvm\" (UniqueName: \"kubernetes.io/projected/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-kube-api-access-7qfvm\") pod \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\" (UID: \"402ddad4-693f-4d7d-8a1b-dd7e80ecb103\") "
Apr 23 18:18:00.198018 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197706 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-util\") pod \"0ced8e25-b230-4af8-84e9-bfdad496ff96\" (UID: \"0ced8e25-b230-4af8-84e9-bfdad496ff96\") "
Apr 23 18:18:00.198018 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197750 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-util\") pod \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\" (UID: \"a8993fa0-0c64-46e7-ae72-46b8769c2a85\") "
Apr 23 18:18:00.198018 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.197776 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-util\") pod \"c5c6717e-7c30-473f-84fc-0e0299571336\" (UID: \"c5c6717e-7c30-473f-84fc-0e0299571336\") "
Apr 23 18:18:00.200428 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.198343 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-bundle" (OuterVolumeSpecName: "bundle") pod "a8993fa0-0c64-46e7-ae72-46b8769c2a85" (UID: "a8993fa0-0c64-46e7-ae72-46b8769c2a85"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:18:00.200428 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.199264 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-bundle" (OuterVolumeSpecName: "bundle") pod "c5c6717e-7c30-473f-84fc-0e0299571336" (UID: "c5c6717e-7c30-473f-84fc-0e0299571336"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:18:00.200428 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.199521 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-bundle" (OuterVolumeSpecName: "bundle") pod "402ddad4-693f-4d7d-8a1b-dd7e80ecb103" (UID: "402ddad4-693f-4d7d-8a1b-dd7e80ecb103"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:18:00.200428 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.199879 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-bundle" (OuterVolumeSpecName: "bundle") pod "0ced8e25-b230-4af8-84e9-bfdad496ff96" (UID: "0ced8e25-b230-4af8-84e9-bfdad496ff96"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:18:00.200428 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.200350 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ced8e25-b230-4af8-84e9-bfdad496ff96-kube-api-access-mbtzt" (OuterVolumeSpecName: "kube-api-access-mbtzt") pod "0ced8e25-b230-4af8-84e9-bfdad496ff96" (UID: "0ced8e25-b230-4af8-84e9-bfdad496ff96"). InnerVolumeSpecName "kube-api-access-mbtzt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:18:00.200898 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.200876 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c6717e-7c30-473f-84fc-0e0299571336-kube-api-access-cvsqv" (OuterVolumeSpecName: "kube-api-access-cvsqv") pod "c5c6717e-7c30-473f-84fc-0e0299571336" (UID: "c5c6717e-7c30-473f-84fc-0e0299571336"). InnerVolumeSpecName "kube-api-access-cvsqv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:18:00.200898 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.200887 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-kube-api-access-7qfvm" (OuterVolumeSpecName: "kube-api-access-7qfvm") pod "402ddad4-693f-4d7d-8a1b-dd7e80ecb103" (UID: "402ddad4-693f-4d7d-8a1b-dd7e80ecb103"). InnerVolumeSpecName "kube-api-access-7qfvm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:18:00.201670 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.201651 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8993fa0-0c64-46e7-ae72-46b8769c2a85-kube-api-access-9jfg7" (OuterVolumeSpecName: "kube-api-access-9jfg7") pod "a8993fa0-0c64-46e7-ae72-46b8769c2a85" (UID: "a8993fa0-0c64-46e7-ae72-46b8769c2a85"). InnerVolumeSpecName "kube-api-access-9jfg7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:18:00.205945 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.205919 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-util" (OuterVolumeSpecName: "util") pod "c5c6717e-7c30-473f-84fc-0e0299571336" (UID: "c5c6717e-7c30-473f-84fc-0e0299571336"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:18:00.206424 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.206404 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-util" (OuterVolumeSpecName: "util") pod "0ced8e25-b230-4af8-84e9-bfdad496ff96" (UID: "0ced8e25-b230-4af8-84e9-bfdad496ff96"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 18:18:00.206957 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.206933 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-util" (OuterVolumeSpecName: "util") pod "402ddad4-693f-4d7d-8a1b-dd7e80ecb103" (UID: "402ddad4-693f-4d7d-8a1b-dd7e80ecb103"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:18:00.207399 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.207382 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-util" (OuterVolumeSpecName: "util") pod "a8993fa0-0c64-46e7-ae72-46b8769c2a85" (UID: "a8993fa0-0c64-46e7-ae72-46b8769c2a85"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:18:00.299299 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299262 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cvsqv\" (UniqueName: \"kubernetes.io/projected/c5c6717e-7c30-473f-84fc-0e0299571336-kube-api-access-cvsqv\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299299 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299297 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299299 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299306 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299316 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbtzt\" (UniqueName: \"kubernetes.io/projected/0ced8e25-b230-4af8-84e9-bfdad496ff96-kube-api-access-mbtzt\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299325 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-bundle\") on node \"ip-10-0-138-68.ec2.internal\" 
DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299333 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299341 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9jfg7\" (UniqueName: \"kubernetes.io/projected/a8993fa0-0c64-46e7-ae72-46b8769c2a85-kube-api-access-9jfg7\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299350 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-bundle\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299358 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qfvm\" (UniqueName: \"kubernetes.io/projected/402ddad4-693f-4d7d-8a1b-dd7e80ecb103-kube-api-access-7qfvm\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299369 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ced8e25-b230-4af8-84e9-bfdad496ff96-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299379 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8993fa0-0c64-46e7-ae72-46b8769c2a85-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.299517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.299386 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c5c6717e-7c30-473f-84fc-0e0299571336-util\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:18:00.925745 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.925623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" event={"ID":"402ddad4-693f-4d7d-8a1b-dd7e80ecb103","Type":"ContainerDied","Data":"2d587e1a3cdca679080f69768ca6b406051ea2eff64cf8ea7fe3351e541cb31b"} Apr 23 18:18:00.925745 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.925666 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bj6vpt" Apr 23 18:18:00.925745 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.925669 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d587e1a3cdca679080f69768ca6b406051ea2eff64cf8ea7fe3351e541cb31b" Apr 23 18:18:00.927362 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.927330 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg" event={"ID":"c5c6717e-7c30-473f-84fc-0e0299571336","Type":"ContainerDied","Data":"166f6954ed08fd4ec5f63d7bbfaa8d2046f041a5e98c967773e7c1bc246ae1e6"} Apr 23 18:18:00.927362 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.927363 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="166f6954ed08fd4ec5f63d7bbfaa8d2046f041a5e98c967773e7c1bc246ae1e6" Apr 23 18:18:00.927555 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.927377 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30bdnsg" Apr 23 18:18:00.929600 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.929580 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46" Apr 23 18:18:00.929813 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.929565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fps46" event={"ID":"0ced8e25-b230-4af8-84e9-bfdad496ff96","Type":"ContainerDied","Data":"6decfdd968ec16a206804ee2c31e5738a09f11368fa97595f405a30355849d38"} Apr 23 18:18:00.929893 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.929826 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6decfdd968ec16a206804ee2c31e5738a09f11368fa97595f405a30355849d38" Apr 23 18:18:00.931374 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.931350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt" event={"ID":"a8993fa0-0c64-46e7-ae72-46b8769c2a85","Type":"ContainerDied","Data":"6f282709a1293780d4006f0a34e3691ea4ebb4d70b46dc808cd01f4df903cb47"} Apr 23 18:18:00.931467 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.931379 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f282709a1293780d4006f0a34e3691ea4ebb4d70b46dc808cd01f4df903cb47" Apr 23 18:18:00.931525 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:00.931474 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503c64jt" Apr 23 18:18:07.289920 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.289880 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476"] Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290211 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerName="util" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290222 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerName="util" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290234 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerName="extract" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290239 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerName="extract" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290250 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerName="pull" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290256 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerName="pull" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290261 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerName="extract" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290267 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerName="extract" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290275 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c6717e-7c30-473f-84fc-0e0299571336" containerName="pull" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290280 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c6717e-7c30-473f-84fc-0e0299571336" containerName="pull" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290288 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerName="util" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290294 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerName="util" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290300 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerName="pull" Apr 23 18:18:07.290303 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290305 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerName="pull" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290313 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c6717e-7c30-473f-84fc-0e0299571336" containerName="extract" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290319 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c6717e-7c30-473f-84fc-0e0299571336" containerName="extract" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290326 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" 
containerName="util" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290331 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerName="util" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290340 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c6717e-7c30-473f-84fc-0e0299571336" containerName="util" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290344 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c6717e-7c30-473f-84fc-0e0299571336" containerName="util" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290351 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerName="pull" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290356 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerName="pull" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290366 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerName="extract" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290371 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerName="extract" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290416 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8993fa0-0c64-46e7-ae72-46b8769c2a85" containerName="extract" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290426 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="402ddad4-693f-4d7d-8a1b-dd7e80ecb103" containerName="extract" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:18:07.290433 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ced8e25-b230-4af8-84e9-bfdad496ff96" containerName="extract" Apr 23 18:18:07.290705 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.290439 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5c6717e-7c30-473f-84fc-0e0299571336" containerName="extract" Apr 23 18:18:07.296760 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.296740 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" Apr 23 18:18:07.299889 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.299859 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 18:18:07.300031 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.300008 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-bklbk\"" Apr 23 18:18:07.300156 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.300131 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 18:18:07.303223 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.303183 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476"] Apr 23 18:18:07.462302 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.462265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbts8\" (UniqueName: \"kubernetes.io/projected/3ea44f52-5a26-411a-83ea-6feb5e67c9fb-kube-api-access-lbts8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-l5476\" (UID: \"3ea44f52-5a26-411a-83ea-6feb5e67c9fb\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" Apr 23 18:18:07.563275 ip-10-0-138-68 
kubenswrapper[2576]: I0423 18:18:07.563242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbts8\" (UniqueName: \"kubernetes.io/projected/3ea44f52-5a26-411a-83ea-6feb5e67c9fb-kube-api-access-lbts8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-l5476\" (UID: \"3ea44f52-5a26-411a-83ea-6feb5e67c9fb\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" Apr 23 18:18:07.572824 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.572788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbts8\" (UniqueName: \"kubernetes.io/projected/3ea44f52-5a26-411a-83ea-6feb5e67c9fb-kube-api-access-lbts8\") pod \"limitador-operator-controller-manager-c7fb4c8d5-l5476\" (UID: \"3ea44f52-5a26-411a-83ea-6feb5e67c9fb\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" Apr 23 18:18:07.608314 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.608284 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" Apr 23 18:18:07.734430 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.734405 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476"] Apr 23 18:18:07.736145 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:18:07.736114 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ea44f52_5a26_411a_83ea_6feb5e67c9fb.slice/crio-da4263f9c02e70173b11eb82bcb1d113e651fb51ae1122fa97351861270733a2 WatchSource:0}: Error finding container da4263f9c02e70173b11eb82bcb1d113e651fb51ae1122fa97351861270733a2: Status 404 returned error can't find the container with id da4263f9c02e70173b11eb82bcb1d113e651fb51ae1122fa97351861270733a2 Apr 23 18:18:07.960847 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:07.960761 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" event={"ID":"3ea44f52-5a26-411a-83ea-6feb5e67c9fb","Type":"ContainerStarted","Data":"da4263f9c02e70173b11eb82bcb1d113e651fb51ae1122fa97351861270733a2"} Apr 23 18:18:09.971481 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:09.971447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" event={"ID":"3ea44f52-5a26-411a-83ea-6feb5e67c9fb","Type":"ContainerStarted","Data":"a6946dae7693fa487cc95d36bac91f366b4893e9d36a6c799d58db11b21a15d3"} Apr 23 18:18:09.971874 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:09.971586 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" Apr 23 18:18:09.996538 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:09.996486 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" podStartSLOduration=0.875111756 podStartE2EDuration="2.996472331s" podCreationTimestamp="2026-04-23 18:18:07 +0000 UTC" firstStartedPulling="2026-04-23 18:18:07.738224023 +0000 UTC m=+2195.668236637" lastFinishedPulling="2026-04-23 18:18:09.859584599 +0000 UTC m=+2197.789597212" observedRunningTime="2026-04-23 18:18:09.993209131 +0000 UTC m=+2197.923221780" watchObservedRunningTime="2026-04-23 18:18:09.996472331 +0000 UTC m=+2197.926484966" Apr 23 18:18:13.237982 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.237948 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-hnn8l"] Apr 23 18:18:13.240415 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.240394 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" Apr 23 18:18:13.243471 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.243452 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-p87gf\"" Apr 23 18:18:13.251216 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.251192 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-hnn8l"] Apr 23 18:18:13.417781 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.417738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw8k7\" (UniqueName: \"kubernetes.io/projected/7e2d9c03-3a31-4a7d-89b2-91f44d29bc91-kube-api-access-qw8k7\") pod \"authorino-operator-7587b89b76-hnn8l\" (UID: \"7e2d9c03-3a31-4a7d-89b2-91f44d29bc91\") " pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" Apr 23 18:18:13.518420 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.518337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw8k7\" 
(UniqueName: \"kubernetes.io/projected/7e2d9c03-3a31-4a7d-89b2-91f44d29bc91-kube-api-access-qw8k7\") pod \"authorino-operator-7587b89b76-hnn8l\" (UID: \"7e2d9c03-3a31-4a7d-89b2-91f44d29bc91\") " pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" Apr 23 18:18:13.541871 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.541844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw8k7\" (UniqueName: \"kubernetes.io/projected/7e2d9c03-3a31-4a7d-89b2-91f44d29bc91-kube-api-access-qw8k7\") pod \"authorino-operator-7587b89b76-hnn8l\" (UID: \"7e2d9c03-3a31-4a7d-89b2-91f44d29bc91\") " pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" Apr 23 18:18:13.550631 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.550606 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" Apr 23 18:18:13.684576 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.684547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-hnn8l"] Apr 23 18:18:13.685595 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:18:13.685558 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e2d9c03_3a31_4a7d_89b2_91f44d29bc91.slice/crio-91e9d3331393aa5c120d04e297a3936d16835aaf6333faa2d51b6fb5e7e3433c WatchSource:0}: Error finding container 91e9d3331393aa5c120d04e297a3936d16835aaf6333faa2d51b6fb5e7e3433c: Status 404 returned error can't find the container with id 91e9d3331393aa5c120d04e297a3936d16835aaf6333faa2d51b6fb5e7e3433c Apr 23 18:18:13.987544 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:13.987503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" event={"ID":"7e2d9c03-3a31-4a7d-89b2-91f44d29bc91","Type":"ContainerStarted","Data":"91e9d3331393aa5c120d04e297a3936d16835aaf6333faa2d51b6fb5e7e3433c"} 
Apr 23 18:18:15.996831 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:15.996788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" event={"ID":"7e2d9c03-3a31-4a7d-89b2-91f44d29bc91","Type":"ContainerStarted","Data":"4ab214fda5c35b9a62f85b64a0371382fcd3462b18349c7f351c099dec13d793"} Apr 23 18:18:15.997216 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:15.996847 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" Apr 23 18:18:16.026421 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:16.026365 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" podStartSLOduration=1.697415644 podStartE2EDuration="3.02635004s" podCreationTimestamp="2026-04-23 18:18:13 +0000 UTC" firstStartedPulling="2026-04-23 18:18:13.687849629 +0000 UTC m=+2201.617862251" lastFinishedPulling="2026-04-23 18:18:15.016784033 +0000 UTC m=+2202.946796647" observedRunningTime="2026-04-23 18:18:16.024589781 +0000 UTC m=+2203.954602423" watchObservedRunningTime="2026-04-23 18:18:16.02635004 +0000 UTC m=+2203.956362676" Apr 23 18:18:20.977413 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:20.977376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-l5476" Apr 23 18:18:27.002838 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:18:27.002805 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-hnn8l" Apr 23 18:19:02.575670 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:02.575589 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-qkxtz"] Apr 23 18:19:02.580330 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:02.580308 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" Apr 23 18:19:02.583246 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:02.583225 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-vnrmc\"" Apr 23 18:19:02.589065 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:02.589030 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-qkxtz"] Apr 23 18:19:02.738568 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:02.738528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2k77\" (UniqueName: \"kubernetes.io/projected/7c66ddf7-86cc-41bd-83e6-b251b1fff8a3-kube-api-access-v2k77\") pod \"authorino-79cbc94b89-qkxtz\" (UID: \"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3\") " pod="kuadrant-system/authorino-79cbc94b89-qkxtz" Apr 23 18:19:02.839565 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:02.839468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2k77\" (UniqueName: \"kubernetes.io/projected/7c66ddf7-86cc-41bd-83e6-b251b1fff8a3-kube-api-access-v2k77\") pod \"authorino-79cbc94b89-qkxtz\" (UID: \"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3\") " pod="kuadrant-system/authorino-79cbc94b89-qkxtz" Apr 23 18:19:02.848606 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:02.848576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2k77\" (UniqueName: \"kubernetes.io/projected/7c66ddf7-86cc-41bd-83e6-b251b1fff8a3-kube-api-access-v2k77\") pod \"authorino-79cbc94b89-qkxtz\" (UID: \"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3\") " pod="kuadrant-system/authorino-79cbc94b89-qkxtz" Apr 23 18:19:02.890610 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:02.890574 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" Apr 23 18:19:03.027535 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:03.027504 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-qkxtz"] Apr 23 18:19:03.030345 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:19:03.030314 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c66ddf7_86cc_41bd_83e6_b251b1fff8a3.slice/crio-4a7c1597d45528dd8276906bd83c16580b36873740c9470bf6f998e2b74f2c43 WatchSource:0}: Error finding container 4a7c1597d45528dd8276906bd83c16580b36873740c9470bf6f998e2b74f2c43: Status 404 returned error can't find the container with id 4a7c1597d45528dd8276906bd83c16580b36873740c9470bf6f998e2b74f2c43 Apr 23 18:19:03.186884 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:03.186798 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" event={"ID":"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3","Type":"ContainerStarted","Data":"4a7c1597d45528dd8276906bd83c16580b36873740c9470bf6f998e2b74f2c43"} Apr 23 18:19:06.201517 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:06.201480 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" event={"ID":"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3","Type":"ContainerStarted","Data":"5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61"} Apr 23 18:19:06.220145 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:06.220088 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" podStartSLOduration=1.523603472 podStartE2EDuration="4.220071707s" podCreationTimestamp="2026-04-23 18:19:02 +0000 UTC" firstStartedPulling="2026-04-23 18:19:03.031596737 +0000 UTC m=+2250.961609351" lastFinishedPulling="2026-04-23 18:19:05.728064957 +0000 UTC m=+2253.658077586" 
observedRunningTime="2026-04-23 18:19:06.219184574 +0000 UTC m=+2254.149197210" watchObservedRunningTime="2026-04-23 18:19:06.220071707 +0000 UTC m=+2254.150084345" Apr 23 18:19:25.928833 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:25.928796 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-qkxtz"] Apr 23 18:19:25.929296 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:25.929064 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" podUID="7c66ddf7-86cc-41bd-83e6-b251b1fff8a3" containerName="authorino" containerID="cri-o://5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61" gracePeriod=30 Apr 23 18:19:26.168766 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.168713 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" Apr 23 18:19:26.238273 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.238195 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2k77\" (UniqueName: \"kubernetes.io/projected/7c66ddf7-86cc-41bd-83e6-b251b1fff8a3-kube-api-access-v2k77\") pod \"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3\" (UID: \"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3\") " Apr 23 18:19:26.240381 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.240347 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c66ddf7-86cc-41bd-83e6-b251b1fff8a3-kube-api-access-v2k77" (OuterVolumeSpecName: "kube-api-access-v2k77") pod "7c66ddf7-86cc-41bd-83e6-b251b1fff8a3" (UID: "7c66ddf7-86cc-41bd-83e6-b251b1fff8a3"). InnerVolumeSpecName "kube-api-access-v2k77". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:19:26.276974 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.276943 2576 generic.go:358] "Generic (PLEG): container finished" podID="7c66ddf7-86cc-41bd-83e6-b251b1fff8a3" containerID="5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61" exitCode=0 Apr 23 18:19:26.277114 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.276995 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" Apr 23 18:19:26.277114 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.277018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" event={"ID":"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3","Type":"ContainerDied","Data":"5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61"} Apr 23 18:19:26.277114 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.277057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-qkxtz" event={"ID":"7c66ddf7-86cc-41bd-83e6-b251b1fff8a3","Type":"ContainerDied","Data":"4a7c1597d45528dd8276906bd83c16580b36873740c9470bf6f998e2b74f2c43"} Apr 23 18:19:26.277114 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.277073 2576 scope.go:117] "RemoveContainer" containerID="5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61" Apr 23 18:19:26.286192 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.286174 2576 scope.go:117] "RemoveContainer" containerID="5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61" Apr 23 18:19:26.286434 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:19:26.286416 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61\": container with ID starting with 5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61 not found: ID does 
not exist" containerID="5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61" Apr 23 18:19:26.286487 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.286444 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61"} err="failed to get container status \"5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61\": rpc error: code = NotFound desc = could not find container \"5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61\": container with ID starting with 5a4066b09d3d94fc4bf26b4f599e956805bdd0225480df3699fde65038f60b61 not found: ID does not exist" Apr 23 18:19:26.302745 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.302709 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-qkxtz"] Apr 23 18:19:26.313606 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.313584 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-qkxtz"] Apr 23 18:19:26.339469 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.339446 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v2k77\" (UniqueName: \"kubernetes.io/projected/7c66ddf7-86cc-41bd-83e6-b251b1fff8a3-kube-api-access-v2k77\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:19:26.648790 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:26.648754 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c66ddf7-86cc-41bd-83e6-b251b1fff8a3" path="/var/lib/kubelet/pods/7c66ddf7-86cc-41bd-83e6-b251b1fff8a3/volumes" Apr 23 18:19:34.481083 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.481051 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp"] Apr 23 18:19:34.481485 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.481444 2576 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="7c66ddf7-86cc-41bd-83e6-b251b1fff8a3" containerName="authorino" Apr 23 18:19:34.481485 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.481459 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c66ddf7-86cc-41bd-83e6-b251b1fff8a3" containerName="authorino" Apr 23 18:19:34.481559 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.481517 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c66ddf7-86cc-41bd-83e6-b251b1fff8a3" containerName="authorino" Apr 23 18:19:34.485082 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.485055 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.500357 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.500325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp"] Apr 23 18:19:34.614386 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.614343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.614560 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.614407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.614560 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.614515 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.614560 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.614550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.614661 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.614568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/def6ef66-30a7-4f7f-a2d2-1f9921020679-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.614661 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.614601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pcq\" (UniqueName: \"kubernetes.io/projected/def6ef66-30a7-4f7f-a2d2-1f9921020679-kube-api-access-j4pcq\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.614661 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.614643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.715120 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.715068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.715318 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.715168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.715318 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.715235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.715441 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.715341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.715441 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.715367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.715441 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.715396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/def6ef66-30a7-4f7f-a2d2-1f9921020679-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.715441 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.715426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pcq\" (UniqueName: \"kubernetes.io/projected/def6ef66-30a7-4f7f-a2d2-1f9921020679-kube-api-access-j4pcq\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.715989 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.715954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.717844 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.717814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.717948 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.717897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/def6ef66-30a7-4f7f-a2d2-1f9921020679-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.718006 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.717968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.718119 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.718102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.728039 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.728015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/def6ef66-30a7-4f7f-a2d2-1f9921020679-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 
23 18:19:34.730171 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.730142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pcq\" (UniqueName: \"kubernetes.io/projected/def6ef66-30a7-4f7f-a2d2-1f9921020679-kube-api-access-j4pcq\") pod \"istiod-openshift-gateway-55ff986f96-fjvbp\" (UID: \"def6ef66-30a7-4f7f-a2d2-1f9921020679\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.794701 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.794610 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:34.942063 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.942029 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp"] Apr 23 18:19:34.942879 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:19:34.942853 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef6ef66_30a7_4f7f_a2d2_1f9921020679.slice/crio-2a0c9c1e3fc7e249a09b1044ba8aad3d967e3c0a80f7ca40bf5da0e23a1c2660 WatchSource:0}: Error finding container 2a0c9c1e3fc7e249a09b1044ba8aad3d967e3c0a80f7ca40bf5da0e23a1c2660: Status 404 returned error can't find the container with id 2a0c9c1e3fc7e249a09b1044ba8aad3d967e3c0a80f7ca40bf5da0e23a1c2660 Apr 23 18:19:34.944919 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.944890 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:19:34.945015 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:34.944947 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:19:35.323027 ip-10-0-138-68 
kubenswrapper[2576]: I0423 18:19:35.322984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" event={"ID":"def6ef66-30a7-4f7f-a2d2-1f9921020679","Type":"ContainerStarted","Data":"47852cc12eca27def5084552f12f8a7c7cc7e11dbcf511567185e5e79d4c798e"} Apr 23 18:19:35.323027 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:35.323026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" event={"ID":"def6ef66-30a7-4f7f-a2d2-1f9921020679","Type":"ContainerStarted","Data":"2a0c9c1e3fc7e249a09b1044ba8aad3d967e3c0a80f7ca40bf5da0e23a1c2660"} Apr 23 18:19:35.323292 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:35.323145 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:35.375899 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:35.375838 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" podStartSLOduration=1.375788528 podStartE2EDuration="1.375788528s" podCreationTimestamp="2026-04-23 18:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:19:35.374119083 +0000 UTC m=+2283.304131719" watchObservedRunningTime="2026-04-23 18:19:35.375788528 +0000 UTC m=+2283.305801163" Apr 23 18:19:36.329806 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.329779 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-fjvbp" Apr 23 18:19:36.453781 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.453746 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj"] Apr 23 18:19:36.454002 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:19:36.453981 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" podUID="8bded537-e03b-4ef3-8d6a-a5db985fdedb" containerName="discovery" containerID="cri-o://38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189" gracePeriod=30 Apr 23 18:19:36.710756 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.710712 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:19:36.834027 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.833996 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-token\") pod \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " Apr 23 18:19:36.834200 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.834070 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-ca-configmap\") pod \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " Apr 23 18:19:36.834200 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.834096 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-cacerts\") pod \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " Apr 23 18:19:36.834200 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.834113 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8bded537-e03b-4ef3-8d6a-a5db985fdedb-local-certs\") pod \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\" (UID: 
\"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " Apr 23 18:19:36.834200 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.834165 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-dns-cert\") pod \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " Apr 23 18:19:36.834484 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.834200 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-kubeconfig\") pod \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " Apr 23 18:19:36.834484 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.834234 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jmrg\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-kube-api-access-4jmrg\") pod \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\" (UID: \"8bded537-e03b-4ef3-8d6a-a5db985fdedb\") " Apr 23 18:19:36.834927 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.834554 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "8bded537-e03b-4ef3-8d6a-a5db985fdedb" (UID: "8bded537-e03b-4ef3-8d6a-a5db985fdedb"). InnerVolumeSpecName "istio-csr-ca-configmap". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:19:36.836939 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.836875 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-cacerts" (OuterVolumeSpecName: "cacerts") pod "8bded537-e03b-4ef3-8d6a-a5db985fdedb" (UID: "8bded537-e03b-4ef3-8d6a-a5db985fdedb"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:19:36.836939 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.836911 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-kube-api-access-4jmrg" (OuterVolumeSpecName: "kube-api-access-4jmrg") pod "8bded537-e03b-4ef3-8d6a-a5db985fdedb" (UID: "8bded537-e03b-4ef3-8d6a-a5db985fdedb"). InnerVolumeSpecName "kube-api-access-4jmrg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:19:36.837055 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.836959 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "8bded537-e03b-4ef3-8d6a-a5db985fdedb" (UID: "8bded537-e03b-4ef3-8d6a-a5db985fdedb"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:19:36.837055 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.836963 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bded537-e03b-4ef3-8d6a-a5db985fdedb-local-certs" (OuterVolumeSpecName: "local-certs") pod "8bded537-e03b-4ef3-8d6a-a5db985fdedb" (UID: "8bded537-e03b-4ef3-8d6a-a5db985fdedb"). InnerVolumeSpecName "local-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:19:36.837055 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.836973 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "8bded537-e03b-4ef3-8d6a-a5db985fdedb" (UID: "8bded537-e03b-4ef3-8d6a-a5db985fdedb"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:19:36.837055 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.836962 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-token" (OuterVolumeSpecName: "istio-token") pod "8bded537-e03b-4ef3-8d6a-a5db985fdedb" (UID: "8bded537-e03b-4ef3-8d6a-a5db985fdedb"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:19:36.935463 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.935380 2576 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-ca-configmap\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:19:36.935463 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.935410 2576 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-cacerts\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:19:36.935463 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.935420 2576 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8bded537-e03b-4ef3-8d6a-a5db985fdedb-local-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:19:36.935463 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.935431 2576 
reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-csr-dns-cert\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:19:36.935463 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.935439 2576 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-kubeconfig\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:19:36.935463 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.935447 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4jmrg\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-kube-api-access-4jmrg\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:19:36.935463 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:36.935455 2576 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8bded537-e03b-4ef3-8d6a-a5db985fdedb-istio-token\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:19:37.332325 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.332289 2576 generic.go:358] "Generic (PLEG): container finished" podID="8bded537-e03b-4ef3-8d6a-a5db985fdedb" containerID="38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189" exitCode=0 Apr 23 18:19:37.332760 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.332357 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" Apr 23 18:19:37.332760 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.332371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" event={"ID":"8bded537-e03b-4ef3-8d6a-a5db985fdedb","Type":"ContainerDied","Data":"38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189"} Apr 23 18:19:37.332760 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.332411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj" event={"ID":"8bded537-e03b-4ef3-8d6a-a5db985fdedb","Type":"ContainerDied","Data":"fe14fa4beaac4055a4fef9a5168543df728711cd11253546ee9ec2ae730bd4ce"} Apr 23 18:19:37.332760 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.332427 2576 scope.go:117] "RemoveContainer" containerID="38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189" Apr 23 18:19:37.341976 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.341950 2576 scope.go:117] "RemoveContainer" containerID="38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189" Apr 23 18:19:37.342258 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:19:37.342237 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189\": container with ID starting with 38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189 not found: ID does not exist" containerID="38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189" Apr 23 18:19:37.342307 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.342269 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189"} err="failed to get container status 
\"38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189\": rpc error: code = NotFound desc = could not find container \"38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189\": container with ID starting with 38d8a935d47b90c28044df5f4154894986dd245492f12f2c89737ed4bf177189 not found: ID does not exist" Apr 23 18:19:37.365839 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.365806 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj"] Apr 23 18:19:37.370770 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:37.370744 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-h4tbj"] Apr 23 18:19:38.647955 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:38.647922 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bded537-e03b-4ef3-8d6a-a5db985fdedb" path="/var/lib/kubelet/pods/8bded537-e03b-4ef3-8d6a-a5db985fdedb/volumes" Apr 23 18:19:44.798047 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.798016 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-4wkm2"] Apr 23 18:19:44.798437 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.798413 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bded537-e03b-4ef3-8d6a-a5db985fdedb" containerName="discovery" Apr 23 18:19:44.798437 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.798431 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bded537-e03b-4ef3-8d6a-a5db985fdedb" containerName="discovery" Apr 23 18:19:44.798584 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.798567 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bded537-e03b-4ef3-8d6a-a5db985fdedb" containerName="discovery" Apr 23 18:19:44.802954 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.802936 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:44.806363 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.806340 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 18:19:44.806363 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.806340 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 18:19:44.806522 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.806376 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 18:19:44.807511 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.807494 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9bvv5\"" Apr 23 18:19:44.811485 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.811458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-4wkm2"] Apr 23 18:19:44.905848 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.905808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dad3b43a-1c3d-4504-b2ad-00d3721005ac-data\") pod \"seaweedfs-86cc847c5c-4wkm2\" (UID: \"dad3b43a-1c3d-4504-b2ad-00d3721005ac\") " pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:44.906021 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:44.905886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9chs\" (UniqueName: \"kubernetes.io/projected/dad3b43a-1c3d-4504-b2ad-00d3721005ac-kube-api-access-q9chs\") pod \"seaweedfs-86cc847c5c-4wkm2\" (UID: \"dad3b43a-1c3d-4504-b2ad-00d3721005ac\") " pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:45.007441 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:45.007403 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dad3b43a-1c3d-4504-b2ad-00d3721005ac-data\") pod \"seaweedfs-86cc847c5c-4wkm2\" (UID: \"dad3b43a-1c3d-4504-b2ad-00d3721005ac\") " pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:45.007589 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:45.007455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9chs\" (UniqueName: \"kubernetes.io/projected/dad3b43a-1c3d-4504-b2ad-00d3721005ac-kube-api-access-q9chs\") pod \"seaweedfs-86cc847c5c-4wkm2\" (UID: \"dad3b43a-1c3d-4504-b2ad-00d3721005ac\") " pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:45.007864 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:45.007844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dad3b43a-1c3d-4504-b2ad-00d3721005ac-data\") pod \"seaweedfs-86cc847c5c-4wkm2\" (UID: \"dad3b43a-1c3d-4504-b2ad-00d3721005ac\") " pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:45.016754 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:45.016692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9chs\" (UniqueName: \"kubernetes.io/projected/dad3b43a-1c3d-4504-b2ad-00d3721005ac-kube-api-access-q9chs\") pod \"seaweedfs-86cc847c5c-4wkm2\" (UID: \"dad3b43a-1c3d-4504-b2ad-00d3721005ac\") " pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:45.114894 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:45.114863 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:45.236922 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:45.236895 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-4wkm2"] Apr 23 18:19:45.239328 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:19:45.239293 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddad3b43a_1c3d_4504_b2ad_00d3721005ac.slice/crio-81219dd4ce23204048f1282d104f78e35a12b0175fecb27e0fdc0ca153ce09c2 WatchSource:0}: Error finding container 81219dd4ce23204048f1282d104f78e35a12b0175fecb27e0fdc0ca153ce09c2: Status 404 returned error can't find the container with id 81219dd4ce23204048f1282d104f78e35a12b0175fecb27e0fdc0ca153ce09c2 Apr 23 18:19:45.370551 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:45.370460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-4wkm2" event={"ID":"dad3b43a-1c3d-4504-b2ad-00d3721005ac","Type":"ContainerStarted","Data":"81219dd4ce23204048f1282d104f78e35a12b0175fecb27e0fdc0ca153ce09c2"} Apr 23 18:19:48.387174 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:48.387127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-4wkm2" event={"ID":"dad3b43a-1c3d-4504-b2ad-00d3721005ac","Type":"ContainerStarted","Data":"015446c1c79ccbadae6110dd77a2f519394ef9c317a3d92f244e54bd32b2cb95"} Apr 23 18:19:48.387752 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:48.387315 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:19:48.407066 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:48.407007 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-4wkm2" podStartSLOduration=1.981705552 podStartE2EDuration="4.406990693s" podCreationTimestamp="2026-04-23 18:19:44 +0000 UTC" firstStartedPulling="2026-04-23 
18:19:45.240552423 +0000 UTC m=+2293.170565037" lastFinishedPulling="2026-04-23 18:19:47.665837548 +0000 UTC m=+2295.595850178" observedRunningTime="2026-04-23 18:19:48.405208625 +0000 UTC m=+2296.335221261" watchObservedRunningTime="2026-04-23 18:19:48.406990693 +0000 UTC m=+2296.337003329" Apr 23 18:19:54.394314 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:19:54.394277 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-4wkm2" Apr 23 18:20:56.184790 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.184760 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-tv9wm"] Apr 23 18:20:56.188207 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.188180 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:56.190995 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.190973 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 23 18:20:56.191155 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.191137 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-df4hl\"" Apr 23 18:20:56.200167 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.200137 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tv9wm"] Apr 23 18:20:56.307553 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.307514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs\") pod \"model-serving-api-86f7b4b499-tv9wm\" (UID: \"25b5f9c2-2f39-43fc-b53e-66bf6bffebea\") " pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:56.307786 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.307561 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcd8f\" (UniqueName: \"kubernetes.io/projected/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-kube-api-access-dcd8f\") pod \"model-serving-api-86f7b4b499-tv9wm\" (UID: \"25b5f9c2-2f39-43fc-b53e-66bf6bffebea\") " pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:56.408460 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.408417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs\") pod \"model-serving-api-86f7b4b499-tv9wm\" (UID: \"25b5f9c2-2f39-43fc-b53e-66bf6bffebea\") " pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:56.408460 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.408463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcd8f\" (UniqueName: \"kubernetes.io/projected/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-kube-api-access-dcd8f\") pod \"model-serving-api-86f7b4b499-tv9wm\" (UID: \"25b5f9c2-2f39-43fc-b53e-66bf6bffebea\") " pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:56.408712 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:20:56.408571 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 23 18:20:56.408712 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:20:56.408645 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs podName:25b5f9c2-2f39-43fc-b53e-66bf6bffebea nodeName:}" failed. No retries permitted until 2026-04-23 18:20:56.90862652 +0000 UTC m=+2364.838639135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs") pod "model-serving-api-86f7b4b499-tv9wm" (UID: "25b5f9c2-2f39-43fc-b53e-66bf6bffebea") : secret "model-serving-api-tls" not found Apr 23 18:20:56.420247 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.420220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcd8f\" (UniqueName: \"kubernetes.io/projected/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-kube-api-access-dcd8f\") pod \"model-serving-api-86f7b4b499-tv9wm\" (UID: \"25b5f9c2-2f39-43fc-b53e-66bf6bffebea\") " pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:56.913412 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:56.913373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs\") pod \"model-serving-api-86f7b4b499-tv9wm\" (UID: \"25b5f9c2-2f39-43fc-b53e-66bf6bffebea\") " pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:56.913620 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:20:56.913566 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 23 18:20:56.913688 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:20:56.913659 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs podName:25b5f9c2-2f39-43fc-b53e-66bf6bffebea nodeName:}" failed. No retries permitted until 2026-04-23 18:20:57.913637928 +0000 UTC m=+2365.843650547 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs") pod "model-serving-api-86f7b4b499-tv9wm" (UID: "25b5f9c2-2f39-43fc-b53e-66bf6bffebea") : secret "model-serving-api-tls" not found Apr 23 18:20:57.920892 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:57.920861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs\") pod \"model-serving-api-86f7b4b499-tv9wm\" (UID: \"25b5f9c2-2f39-43fc-b53e-66bf6bffebea\") " pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:57.923319 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:57.923290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25b5f9c2-2f39-43fc-b53e-66bf6bffebea-tls-certs\") pod \"model-serving-api-86f7b4b499-tv9wm\" (UID: \"25b5f9c2-2f39-43fc-b53e-66bf6bffebea\") " pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:58.000676 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:58.000634 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:20:58.127130 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:58.127104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tv9wm"] Apr 23 18:20:58.129698 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:20:58.129670 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b5f9c2_2f39_43fc_b53e_66bf6bffebea.slice/crio-048359ad98e1592c5787e18e03633058462f8ac624972fa0cbdb31d4754e97a8 WatchSource:0}: Error finding container 048359ad98e1592c5787e18e03633058462f8ac624972fa0cbdb31d4754e97a8: Status 404 returned error can't find the container with id 048359ad98e1592c5787e18e03633058462f8ac624972fa0cbdb31d4754e97a8 Apr 23 18:20:58.131481 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:58.131461 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:20:58.681411 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:20:58.681367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tv9wm" event={"ID":"25b5f9c2-2f39-43fc-b53e-66bf6bffebea","Type":"ContainerStarted","Data":"048359ad98e1592c5787e18e03633058462f8ac624972fa0cbdb31d4754e97a8"} Apr 23 18:21:00.698816 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:00.698765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tv9wm" event={"ID":"25b5f9c2-2f39-43fc-b53e-66bf6bffebea","Type":"ContainerStarted","Data":"63cd15f3894bb42920d11cc0e18aa70c76691f935b1f0a10eb52c46b67f014a2"} Apr 23 18:21:00.699301 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:00.698876 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:21:00.719204 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:00.719148 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-tv9wm" podStartSLOduration=2.592601494 podStartE2EDuration="4.719132625s" podCreationTimestamp="2026-04-23 18:20:56 +0000 UTC" firstStartedPulling="2026-04-23 18:20:58.131617266 +0000 UTC m=+2366.061629879" lastFinishedPulling="2026-04-23 18:21:00.258148393 +0000 UTC m=+2368.188161010" observedRunningTime="2026-04-23 18:21:00.717215546 +0000 UTC m=+2368.647228181" watchObservedRunningTime="2026-04-23 18:21:00.719132625 +0000 UTC m=+2368.649145262" Apr 23 18:21:11.210624 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.210585 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-jcm8g"] Apr 23 18:21:11.214302 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.214281 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jcm8g" Apr 23 18:21:11.222069 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.222046 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jcm8g"] Apr 23 18:21:11.344519 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.344475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knjbk\" (UniqueName: \"kubernetes.io/projected/149abbbd-9ce8-4595-b1fa-9bf891e9d038-kube-api-access-knjbk\") pod \"s3-init-jcm8g\" (UID: \"149abbbd-9ce8-4595-b1fa-9bf891e9d038\") " pod="kserve/s3-init-jcm8g" Apr 23 18:21:11.445324 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.445288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knjbk\" (UniqueName: \"kubernetes.io/projected/149abbbd-9ce8-4595-b1fa-9bf891e9d038-kube-api-access-knjbk\") pod \"s3-init-jcm8g\" (UID: \"149abbbd-9ce8-4595-b1fa-9bf891e9d038\") " pod="kserve/s3-init-jcm8g" Apr 23 18:21:11.455043 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.455017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-knjbk\" (UniqueName: \"kubernetes.io/projected/149abbbd-9ce8-4595-b1fa-9bf891e9d038-kube-api-access-knjbk\") pod \"s3-init-jcm8g\" (UID: \"149abbbd-9ce8-4595-b1fa-9bf891e9d038\") " pod="kserve/s3-init-jcm8g" Apr 23 18:21:11.524285 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.524203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jcm8g" Apr 23 18:21:11.653684 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.653655 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jcm8g"] Apr 23 18:21:11.655298 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:21:11.655265 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod149abbbd_9ce8_4595_b1fa_9bf891e9d038.slice/crio-bccc7fe1666905c6ca3b8698b84946c944ef3478b8c8d7cc71c053276a15682f WatchSource:0}: Error finding container bccc7fe1666905c6ca3b8698b84946c944ef3478b8c8d7cc71c053276a15682f: Status 404 returned error can't find the container with id bccc7fe1666905c6ca3b8698b84946c944ef3478b8c8d7cc71c053276a15682f Apr 23 18:21:11.706712 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.706682 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-tv9wm" Apr 23 18:21:11.744609 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:11.744574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jcm8g" event={"ID":"149abbbd-9ce8-4595-b1fa-9bf891e9d038","Type":"ContainerStarted","Data":"bccc7fe1666905c6ca3b8698b84946c944ef3478b8c8d7cc71c053276a15682f"} Apr 23 18:21:16.767949 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:16.767908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jcm8g" event={"ID":"149abbbd-9ce8-4595-b1fa-9bf891e9d038","Type":"ContainerStarted","Data":"bfe52e40024876327cbab782538da0c8d1ab1e0d773079fcaecc25cc72b64d1f"} Apr 23 18:21:16.787535 
ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:16.787471 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-jcm8g" podStartSLOduration=1.373595003 podStartE2EDuration="5.787450996s" podCreationTimestamp="2026-04-23 18:21:11 +0000 UTC" firstStartedPulling="2026-04-23 18:21:11.657111859 +0000 UTC m=+2379.587124473" lastFinishedPulling="2026-04-23 18:21:16.070967853 +0000 UTC m=+2384.000980466" observedRunningTime="2026-04-23 18:21:16.787227149 +0000 UTC m=+2384.717239786" watchObservedRunningTime="2026-04-23 18:21:16.787450996 +0000 UTC m=+2384.717463625" Apr 23 18:21:19.780737 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:19.780682 2576 generic.go:358] "Generic (PLEG): container finished" podID="149abbbd-9ce8-4595-b1fa-9bf891e9d038" containerID="bfe52e40024876327cbab782538da0c8d1ab1e0d773079fcaecc25cc72b64d1f" exitCode=0 Apr 23 18:21:19.781105 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:19.780779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jcm8g" event={"ID":"149abbbd-9ce8-4595-b1fa-9bf891e9d038","Type":"ContainerDied","Data":"bfe52e40024876327cbab782538da0c8d1ab1e0d773079fcaecc25cc72b64d1f"} Apr 23 18:21:20.919037 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:20.919013 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jcm8g" Apr 23 18:21:21.031687 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:21.031644 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knjbk\" (UniqueName: \"kubernetes.io/projected/149abbbd-9ce8-4595-b1fa-9bf891e9d038-kube-api-access-knjbk\") pod \"149abbbd-9ce8-4595-b1fa-9bf891e9d038\" (UID: \"149abbbd-9ce8-4595-b1fa-9bf891e9d038\") " Apr 23 18:21:21.033865 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:21.033840 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149abbbd-9ce8-4595-b1fa-9bf891e9d038-kube-api-access-knjbk" (OuterVolumeSpecName: "kube-api-access-knjbk") pod "149abbbd-9ce8-4595-b1fa-9bf891e9d038" (UID: "149abbbd-9ce8-4595-b1fa-9bf891e9d038"). InnerVolumeSpecName "kube-api-access-knjbk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:21:21.132437 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:21.132393 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-knjbk\" (UniqueName: \"kubernetes.io/projected/149abbbd-9ce8-4595-b1fa-9bf891e9d038-kube-api-access-knjbk\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:21:21.789788 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:21.789752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jcm8g" event={"ID":"149abbbd-9ce8-4595-b1fa-9bf891e9d038","Type":"ContainerDied","Data":"bccc7fe1666905c6ca3b8698b84946c944ef3478b8c8d7cc71c053276a15682f"} Apr 23 18:21:21.789788 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:21.789794 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bccc7fe1666905c6ca3b8698b84946c944ef3478b8c8d7cc71c053276a15682f" Apr 23 18:21:21.789788 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:21.789770 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jcm8g" Apr 23 18:21:31.853056 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.853018 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd"] Apr 23 18:21:31.853612 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.853590 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="149abbbd-9ce8-4595-b1fa-9bf891e9d038" containerName="s3-init" Apr 23 18:21:31.853756 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.853616 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="149abbbd-9ce8-4595-b1fa-9bf891e9d038" containerName="s3-init" Apr 23 18:21:31.853756 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.853701 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="149abbbd-9ce8-4595-b1fa-9bf891e9d038" containerName="s3-init" Apr 23 18:21:31.857991 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.857968 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.860928 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.860900 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:21:31.860928 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.860920 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-4fb42\"" Apr 23 18:21:31.861126 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.860949 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 23 18:21:31.861265 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.861236 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 18:21:31.868550 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.868526 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd"] Apr 23 18:21:31.911162 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.911346 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-credential-socket\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.911346 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.911346 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.911346 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t78n8\" (UniqueName: \"kubernetes.io/projected/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-kube-api-access-t78n8\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.911555 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: 
\"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.911555 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.911555 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:31.911555 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:31.911484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012547 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.012499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012712 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.012557 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012712 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.012583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012712 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.012620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012712 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.012686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t78n8\" (UniqueName: \"kubernetes.io/projected/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-kube-api-access-t78n8\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012940 ip-10-0-138-68 kubenswrapper[2576]: 
I0423 18:21:32.012742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012940 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.012793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012940 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.012831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.012940 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.012862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.013112 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.013034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.013112 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.013050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.013380 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.013352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.013516 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.013391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.013586 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.013549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: 
\"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.014967 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.014945 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.015789 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.015770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.021526 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.021494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t78n8\" (UniqueName: \"kubernetes.io/projected/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-kube-api-access-t78n8\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.021761 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.021744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e81f2a0-2be6-48bc-8d7e-e039827fbfd7-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-mqqmd\" (UID: \"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.173553 ip-10-0-138-68 kubenswrapper[2576]: 
I0423 18:21:32.173460 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:32.317058 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.317026 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd"] Apr 23 18:21:32.318275 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:21:32.318243 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e81f2a0_2be6_48bc_8d7e_e039827fbfd7.slice/crio-9335fff524bd1b79781234975800be5731bf9c37bec2acb6cc7109f533e15176 WatchSource:0}: Error finding container 9335fff524bd1b79781234975800be5731bf9c37bec2acb6cc7109f533e15176: Status 404 returned error can't find the container with id 9335fff524bd1b79781234975800be5731bf9c37bec2acb6cc7109f533e15176 Apr 23 18:21:32.320465 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.320429 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:21:32.320550 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.320504 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:21:32.320615 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.320547 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 18:21:32.679815 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.679698 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:21:32.679815 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.679751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:21:32.694875 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.683423 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:21:32.694875 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.683532 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:21:32.831397 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.831365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" event={"ID":"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7","Type":"ContainerStarted","Data":"f16663e31de503b0192f8c97df7ff985ed148786dc7e6833b0854c434600045e"} Apr 23 18:21:32.831397 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.831402 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" event={"ID":"6e81f2a0-2be6-48bc-8d7e-e039827fbfd7","Type":"ContainerStarted","Data":"9335fff524bd1b79781234975800be5731bf9c37bec2acb6cc7109f533e15176"} Apr 23 18:21:32.869229 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:32.869177 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" podStartSLOduration=1.869163383 podStartE2EDuration="1.869163383s" podCreationTimestamp="2026-04-23 18:21:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:21:32.866318945 +0000 UTC m=+2400.796331580" watchObservedRunningTime="2026-04-23 18:21:32.869163383 +0000 UTC m=+2400.799176124" Apr 23 18:21:33.174067 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:33.174029 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:33.179370 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:33.179342 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:33.835325 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:33.835295 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:33.836337 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:33.836318 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-mqqmd" Apr 23 18:21:39.783154 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.783112 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv"] Apr 23 18:21:39.791981 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.791953 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.796047 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.796022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 18:21:39.796190 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.796113 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 23 18:21:39.796282 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.796228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-cdlh9\"" Apr 23 18:21:39.798359 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.798331 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv"] Apr 23 18:21:39.875344 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.875313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/194856d7-3e08-4f99-92c1-14c31661166b-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.875344 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.875354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.875558 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.875379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.875558 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.875453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.875558 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.875486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnj69\" (UniqueName: \"kubernetes.io/projected/194856d7-3e08-4f99-92c1-14c31661166b-kube-api-access-hnj69\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.875558 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.875505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-tmp\") pod 
\"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.976647 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.976611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.976855 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.976652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnj69\" (UniqueName: \"kubernetes.io/projected/194856d7-3e08-4f99-92c1-14c31661166b-kube-api-access-hnj69\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.976855 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.976677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.976855 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.976820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/194856d7-3e08-4f99-92c1-14c31661166b-tls-certs\") pod 
\"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.976855 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.976853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.977057 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.976887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.977163 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.977139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.977224 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.977171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: 
\"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.977224 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.977139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.977224 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.977201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.979564 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.979537 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/194856d7-3e08-4f99-92c1-14c31661166b-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:39.985226 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:39.985208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnj69\" (UniqueName: \"kubernetes.io/projected/194856d7-3e08-4f99-92c1-14c31661166b-kube-api-access-hnj69\") pod \"scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:40.105874 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:40.105827 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:21:40.239840 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:40.239815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv"] Apr 23 18:21:40.241620 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:21:40.241591 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194856d7_3e08_4f99_92c1_14c31661166b.slice/crio-493e99690aa4c63d6d2db009edca32f9019fb3bbb47c7279fb697e2c47a1188d WatchSource:0}: Error finding container 493e99690aa4c63d6d2db009edca32f9019fb3bbb47c7279fb697e2c47a1188d: Status 404 returned error can't find the container with id 493e99690aa4c63d6d2db009edca32f9019fb3bbb47c7279fb697e2c47a1188d Apr 23 18:21:40.862099 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:40.862063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" event={"ID":"194856d7-3e08-4f99-92c1-14c31661166b","Type":"ContainerStarted","Data":"493e99690aa4c63d6d2db009edca32f9019fb3bbb47c7279fb697e2c47a1188d"} Apr 23 18:21:44.888323 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:44.888284 2576 generic.go:358] "Generic (PLEG): container finished" podID="194856d7-3e08-4f99-92c1-14c31661166b" containerID="6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297" exitCode=0 Apr 23 18:21:44.888771 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:44.888344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" 
event={"ID":"194856d7-3e08-4f99-92c1-14c31661166b","Type":"ContainerDied","Data":"6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297"} Apr 23 18:21:46.899910 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:21:46.899869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" event={"ID":"194856d7-3e08-4f99-92c1-14c31661166b","Type":"ContainerStarted","Data":"a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0"} Apr 23 18:22:18.066915 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:22:18.066882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" event={"ID":"194856d7-3e08-4f99-92c1-14c31661166b","Type":"ContainerStarted","Data":"8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e"} Apr 23 18:22:18.067376 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:22:18.067080 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:22:18.069559 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:22:18.069539 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:22:18.094180 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:22:18.094129 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" podStartSLOduration=1.749057973 podStartE2EDuration="39.094114286s" podCreationTimestamp="2026-04-23 18:21:39 +0000 UTC" firstStartedPulling="2026-04-23 18:21:40.243931362 +0000 UTC m=+2408.173943976" lastFinishedPulling="2026-04-23 18:22:17.588987658 +0000 UTC m=+2445.519000289" observedRunningTime="2026-04-23 18:22:18.091033323 +0000 UTC m=+2446.021045961" 
watchObservedRunningTime="2026-04-23 18:22:18.094114286 +0000 UTC m=+2446.024126923" Apr 23 18:22:20.106546 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:22:20.106508 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:22:20.106546 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:22:20.106554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:22:30.107897 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:22:30.107858 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:22:30.109231 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:22:30.109213 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:26:32.721398 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:26:32.721275 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:26:32.725510 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:26:32.723180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:26:32.725510 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:26:32.724992 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:26:32.726659 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:26:32.726641 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:31:32.761984 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:31:32.761874 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:31:32.766121 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:31:32.765088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:31:32.766121 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:31:32.765313 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:31:32.769039 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:31:32.769010 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:36:32.793328 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:36:32.793222 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:36:32.797360 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:36:32.797216 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:36:32.797939 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:36:32.797918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:36:32.801691 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:36:32.801671 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:38:02.704063 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.704023 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l"] Apr 23 18:38:02.707925 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.707903 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.710825 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.710783 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-4bmqh\"" Apr 23 18:38:02.710951 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.710840 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 23 18:38:02.723439 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.723415 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l"] Apr 23 18:38:02.772169 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.772132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.772169 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.772172 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.772649 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.772284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7089632-2275-4236-88ee-572597e89f09-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.772649 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.772347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.772649 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.772367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.772649 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.772388 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5cr6\" (UniqueName: \"kubernetes.io/projected/a7089632-2275-4236-88ee-572597e89f09-kube-api-access-p5cr6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.873092 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.873056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7089632-2275-4236-88ee-572597e89f09-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.873297 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.873128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.873297 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.873162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.873297 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.873191 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5cr6\" (UniqueName: \"kubernetes.io/projected/a7089632-2275-4236-88ee-572597e89f09-kube-api-access-p5cr6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.873297 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.873219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.873297 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.873250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.874081 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.873918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.874213 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.874089 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.874213 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.874119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.874213 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.874200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.877673 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.877131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7089632-2275-4236-88ee-572597e89f09-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:02.884482 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:02.884448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5cr6\" (UniqueName: 
\"kubernetes.io/projected/a7089632-2275-4236-88ee-572597e89f09-kube-api-access-p5cr6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:03.018379 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:03.018279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:03.147458 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:03.147429 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l"] Apr 23 18:38:03.149607 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:38:03.149573 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7089632_2275_4236_88ee_572597e89f09.slice/crio-3790af7fbe39385db8becc9836abae3e172278a2f9052a7c6271721d4c6fc659 WatchSource:0}: Error finding container 3790af7fbe39385db8becc9836abae3e172278a2f9052a7c6271721d4c6fc659: Status 404 returned error can't find the container with id 3790af7fbe39385db8becc9836abae3e172278a2f9052a7c6271721d4c6fc659 Apr 23 18:38:03.151904 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:03.151885 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:38:03.815159 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:03.815124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" event={"ID":"a7089632-2275-4236-88ee-572597e89f09","Type":"ContainerStarted","Data":"c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792"} Apr 23 18:38:03.815159 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:03.815159 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" event={"ID":"a7089632-2275-4236-88ee-572597e89f09","Type":"ContainerStarted","Data":"3790af7fbe39385db8becc9836abae3e172278a2f9052a7c6271721d4c6fc659"} Apr 23 18:38:04.820203 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:04.820163 2576 generic.go:358] "Generic (PLEG): container finished" podID="a7089632-2275-4236-88ee-572597e89f09" containerID="c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792" exitCode=0 Apr 23 18:38:04.820674 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:04.820243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" event={"ID":"a7089632-2275-4236-88ee-572597e89f09","Type":"ContainerDied","Data":"c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792"} Apr 23 18:38:05.827491 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:05.827454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" event={"ID":"a7089632-2275-4236-88ee-572597e89f09","Type":"ContainerStarted","Data":"1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9"} Apr 23 18:38:05.827491 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:05.827494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" event={"ID":"a7089632-2275-4236-88ee-572597e89f09","Type":"ContainerStarted","Data":"98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20"} Apr 23 18:38:05.827916 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:05.827554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:05.853908 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:05.853843 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" podStartSLOduration=3.853822557 podStartE2EDuration="3.853822557s" podCreationTimestamp="2026-04-23 18:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:38:05.851903371 +0000 UTC m=+3393.781916062" watchObservedRunningTime="2026-04-23 18:38:05.853822557 +0000 UTC m=+3393.783835193" Apr 23 18:38:13.018419 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:13.018359 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:13.019070 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:13.018431 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:13.021387 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:13.021360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:13.863307 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:13.863276 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:38:34.867309 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:38:34.867236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:40:56.844938 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:56.844903 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv"] 
Apr 23 18:40:56.845471 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:56.845280 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="main" containerID="cri-o://a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0" gracePeriod=30 Apr 23 18:40:56.845471 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:56.845345 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="tokenizer" containerID="cri-o://8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e" gracePeriod=30 Apr 23 18:40:57.525776 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:57.525743 2576 generic.go:358] "Generic (PLEG): container finished" podID="194856d7-3e08-4f99-92c1-14c31661166b" containerID="a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0" exitCode=0 Apr 23 18:40:57.525961 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:57.525810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" event={"ID":"194856d7-3e08-4f99-92c1-14c31661166b","Type":"ContainerDied","Data":"a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0"} Apr 23 18:40:58.110462 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.110439 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:40:58.176451 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.176422 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnj69\" (UniqueName: \"kubernetes.io/projected/194856d7-3e08-4f99-92c1-14c31661166b-kube-api-access-hnj69\") pod \"194856d7-3e08-4f99-92c1-14c31661166b\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " Apr 23 18:40:58.176604 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.176463 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-tmp\") pod \"194856d7-3e08-4f99-92c1-14c31661166b\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " Apr 23 18:40:58.176604 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.176493 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-kserve-provision-location\") pod \"194856d7-3e08-4f99-92c1-14c31661166b\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " Apr 23 18:40:58.176604 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.176563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-uds\") pod \"194856d7-3e08-4f99-92c1-14c31661166b\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " Apr 23 18:40:58.176604 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.176595 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/194856d7-3e08-4f99-92c1-14c31661166b-tls-certs\") pod \"194856d7-3e08-4f99-92c1-14c31661166b\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " Apr 23 
18:40:58.176822 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.176659 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-cache\") pod \"194856d7-3e08-4f99-92c1-14c31661166b\" (UID: \"194856d7-3e08-4f99-92c1-14c31661166b\") " Apr 23 18:40:58.176927 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.176840 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "194856d7-3e08-4f99-92c1-14c31661166b" (UID: "194856d7-3e08-4f99-92c1-14c31661166b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:58.176927 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.176916 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "194856d7-3e08-4f99-92c1-14c31661166b" (UID: "194856d7-3e08-4f99-92c1-14c31661166b"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:58.177090 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.177011 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:40:58.177090 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.177027 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:40:58.177090 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.177054 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "194856d7-3e08-4f99-92c1-14c31661166b" (UID: "194856d7-3e08-4f99-92c1-14c31661166b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:58.177464 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.177438 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "194856d7-3e08-4f99-92c1-14c31661166b" (UID: "194856d7-3e08-4f99-92c1-14c31661166b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:40:58.178874 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.178855 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194856d7-3e08-4f99-92c1-14c31661166b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "194856d7-3e08-4f99-92c1-14c31661166b" (UID: "194856d7-3e08-4f99-92c1-14c31661166b"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:40:58.178971 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.178951 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194856d7-3e08-4f99-92c1-14c31661166b-kube-api-access-hnj69" (OuterVolumeSpecName: "kube-api-access-hnj69") pod "194856d7-3e08-4f99-92c1-14c31661166b" (UID: "194856d7-3e08-4f99-92c1-14c31661166b"). InnerVolumeSpecName "kube-api-access-hnj69". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:40:58.277865 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.277781 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:40:58.277865 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.277814 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnj69\" (UniqueName: \"kubernetes.io/projected/194856d7-3e08-4f99-92c1-14c31661166b-kube-api-access-hnj69\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:40:58.277865 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.277829 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/194856d7-3e08-4f99-92c1-14c31661166b-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:40:58.277865 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.277841 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/194856d7-3e08-4f99-92c1-14c31661166b-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:40:58.531937 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.531850 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="194856d7-3e08-4f99-92c1-14c31661166b" containerID="8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e" exitCode=0 Apr 23 18:40:58.531937 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.531921 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" Apr 23 18:40:58.532145 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.531929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" event={"ID":"194856d7-3e08-4f99-92c1-14c31661166b","Type":"ContainerDied","Data":"8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e"} Apr 23 18:40:58.532145 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.531968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" event={"ID":"194856d7-3e08-4f99-92c1-14c31661166b","Type":"ContainerDied","Data":"493e99690aa4c63d6d2db009edca32f9019fb3bbb47c7279fb697e2c47a1188d"} Apr 23 18:40:58.532145 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.531987 2576 scope.go:117] "RemoveContainer" containerID="8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e" Apr 23 18:40:58.543865 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.543845 2576 scope.go:117] "RemoveContainer" containerID="a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0" Apr 23 18:40:58.552299 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.552279 2576 scope.go:117] "RemoveContainer" containerID="6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297" Apr 23 18:40:58.560877 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.560847 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv"] Apr 23 18:40:58.564309 ip-10-0-138-68 
kubenswrapper[2576]: I0423 18:40:58.564286 2576 scope.go:117] "RemoveContainer" containerID="8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e" Apr 23 18:40:58.564682 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:40:58.564658 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e\": container with ID starting with 8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e not found: ID does not exist" containerID="8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e" Apr 23 18:40:58.564802 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.564694 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e"} err="failed to get container status \"8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e\": rpc error: code = NotFound desc = could not find container \"8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e\": container with ID starting with 8694edc9a560bfb867af5e4ff20c0f57ffcb0f30ed458aa177ccced3bf80c62e not found: ID does not exist" Apr 23 18:40:58.564802 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.564778 2576 scope.go:117] "RemoveContainer" containerID="a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0" Apr 23 18:40:58.565140 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:40:58.565112 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0\": container with ID starting with a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0 not found: ID does not exist" containerID="a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0" Apr 23 18:40:58.565223 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:40:58.565161 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0"} err="failed to get container status \"a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0\": rpc error: code = NotFound desc = could not find container \"a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0\": container with ID starting with a64147d0a26b3372dbc1819cb5975258fa9fde3d3ba2ea75f3296df8d77cfdc0 not found: ID does not exist" Apr 23 18:40:58.565223 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.565189 2576 scope.go:117] "RemoveContainer" containerID="6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297" Apr 23 18:40:58.565527 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:40:58.565498 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297\": container with ID starting with 6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297 not found: ID does not exist" containerID="6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297" Apr 23 18:40:58.565600 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.565539 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297"} err="failed to get container status \"6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297\": rpc error: code = NotFound desc = could not find container \"6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297\": container with ID starting with 6554bc6a2d5e27de0e506182dea51b624f4939d0e8a52088bf1183057172b297 not found: ID does not exist" Apr 23 18:40:58.570452 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.570430 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv"] Apr 23 18:40:58.648094 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:58.648055 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194856d7-3e08-4f99-92c1-14c31661166b" path="/var/lib/kubelet/pods/194856d7-3e08-4f99-92c1-14c31661166b/volumes" Apr 23 18:40:59.067990 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:40:59.067945 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-d6db7h4vsv" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.44:9003\" within 1s: context deadline exceeded" Apr 23 18:40:59.068168 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:40:59.068070 2576 logging.go:55] [core] [Channel #785 SubChannel #786]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.44:9003", ServerName: "10.133.0.44:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.44:9003: operation was canceled" Apr 23 18:41:06.347188 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.347149 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs"] Apr 23 18:41:06.347849 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.347827 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="tokenizer" Apr 23 18:41:06.347937 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.347852 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="tokenizer" Apr 23 18:41:06.347937 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.347887 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="storage-initializer" Apr 23 18:41:06.347937 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.347896 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="storage-initializer" Apr 23 18:41:06.347937 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.347916 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="main" Apr 23 18:41:06.347937 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.347924 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="main" Apr 23 18:41:06.348139 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.348004 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="tokenizer" Apr 23 18:41:06.348139 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.348019 2576 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="194856d7-3e08-4f99-92c1-14c31661166b" containerName="main" Apr 23 18:41:06.353473 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.353452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.356695 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.356558 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 23 18:41:06.356695 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.356629 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-9r9s2\"" Apr 23 18:41:06.360913 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.360887 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs"] Apr 23 18:41:06.449102 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.449064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.449102 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.449101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.449342 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.449120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5764fd45-af8b-4311-b966-f79bd178fc4c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.449342 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.449197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.449342 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.449318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.449486 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.449382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nwdz\" (UniqueName: \"kubernetes.io/projected/5764fd45-af8b-4311-b966-f79bd178fc4c-kube-api-access-2nwdz\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: 
\"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550258 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550258 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550479 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5764fd45-af8b-4311-b966-f79bd178fc4c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550479 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550479 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550479 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nwdz\" (UniqueName: \"kubernetes.io/projected/5764fd45-af8b-4311-b966-f79bd178fc4c-kube-api-access-2nwdz\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550735 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550781 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550824 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.550890 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.550866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.552969 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.552936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5764fd45-af8b-4311-b966-f79bd178fc4c-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.558514 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.558492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nwdz\" (UniqueName: \"kubernetes.io/projected/5764fd45-af8b-4311-b966-f79bd178fc4c-kube-api-access-2nwdz\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.665362 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.665275 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:06.804844 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:06.804815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs"] Apr 23 18:41:06.806274 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:41:06.806239 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5764fd45_af8b_4311_b966_f79bd178fc4c.slice/crio-c576677019dcf5f16e251bf97cbd3c3eefb7a0ade852027ee9b17cc1a68334ff WatchSource:0}: Error finding container c576677019dcf5f16e251bf97cbd3c3eefb7a0ade852027ee9b17cc1a68334ff: Status 404 returned error can't find the container with id c576677019dcf5f16e251bf97cbd3c3eefb7a0ade852027ee9b17cc1a68334ff Apr 23 18:41:07.568440 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:07.568404 2576 generic.go:358] "Generic (PLEG): container finished" podID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerID="088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8" exitCode=0 Apr 23 18:41:07.568843 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:07.568475 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" event={"ID":"5764fd45-af8b-4311-b966-f79bd178fc4c","Type":"ContainerDied","Data":"088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8"} Apr 23 18:41:07.568843 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:07.568515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" 
event={"ID":"5764fd45-af8b-4311-b966-f79bd178fc4c","Type":"ContainerStarted","Data":"c576677019dcf5f16e251bf97cbd3c3eefb7a0ade852027ee9b17cc1a68334ff"} Apr 23 18:41:08.574290 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:08.574247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" event={"ID":"5764fd45-af8b-4311-b966-f79bd178fc4c","Type":"ContainerStarted","Data":"47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc"} Apr 23 18:41:08.574290 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:08.574290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" event={"ID":"5764fd45-af8b-4311-b966-f79bd178fc4c","Type":"ContainerStarted","Data":"9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5"} Apr 23 18:41:08.574880 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:08.574349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:08.597335 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:08.597272 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" podStartSLOduration=2.597251661 podStartE2EDuration="2.597251661s" podCreationTimestamp="2026-04-23 18:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:41:08.595614572 +0000 UTC m=+3576.525627208" watchObservedRunningTime="2026-04-23 18:41:08.597251661 +0000 UTC m=+3576.527264311" Apr 23 18:41:16.666268 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:16.666234 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:16.666268 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:16.666274 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:16.669074 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:16.669050 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:17.613891 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:17.613862 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:41:32.827996 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:32.827958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:41:32.832009 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:32.831981 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:41:32.834427 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:32.834402 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:41:32.838604 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:32.838570 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:41:38.617484 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:41:38.617450 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" Apr 23 18:46:32.863993 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:46:32.863887 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:46:32.868443 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:46:32.868420 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:46:32.875944 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:46:32.875922 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:46:32.880003 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:46:32.879983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:51:32.900062 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:51:32.899956 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:51:32.904003 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:51:32.903567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:51:32.913584 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:51:32.913562 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:51:32.916917 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:51:32.916899 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:56:32.938140 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:56:32.938032 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:56:32.942835 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:56:32.942798 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:56:32.952032 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:56:32.952003 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 18:56:32.955089 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:56:32.955070 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 18:58:57.644202 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:57.644165 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l"] Apr 23 18:58:57.645178 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:57.644593 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="main" containerID="cri-o://98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20" gracePeriod=30 Apr 23 18:58:57.645178 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:57.644670 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="tokenizer" containerID="cri-o://1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9" gracePeriod=30 Apr 23 18:58:57.898639 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:57.898554 2576 generic.go:358] "Generic (PLEG): container finished" podID="a7089632-2275-4236-88ee-572597e89f09" containerID="98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20" exitCode=0 Apr 23 18:58:57.898639 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:57.898625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" event={"ID":"a7089632-2275-4236-88ee-572597e89f09","Type":"ContainerDied","Data":"98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20"} Apr 23 18:58:58.797088 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.797066 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:58:58.871975 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.871938 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-uds\") pod \"a7089632-2275-4236-88ee-572597e89f09\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " Apr 23 18:58:58.872167 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872047 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-kserve-provision-location\") pod \"a7089632-2275-4236-88ee-572597e89f09\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " Apr 23 18:58:58.872167 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-tmp\") pod \"a7089632-2275-4236-88ee-572597e89f09\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " Apr 23 18:58:58.872167 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872103 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7089632-2275-4236-88ee-572597e89f09-tls-certs\") pod \"a7089632-2275-4236-88ee-572597e89f09\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " Apr 23 18:58:58.872167 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872127 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-cache\") pod \"a7089632-2275-4236-88ee-572597e89f09\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " Apr 23 
18:58:58.872375 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872180 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5cr6\" (UniqueName: \"kubernetes.io/projected/a7089632-2275-4236-88ee-572597e89f09-kube-api-access-p5cr6\") pod \"a7089632-2275-4236-88ee-572597e89f09\" (UID: \"a7089632-2275-4236-88ee-572597e89f09\") " Apr 23 18:58:58.872375 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872210 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a7089632-2275-4236-88ee-572597e89f09" (UID: "a7089632-2275-4236-88ee-572597e89f09"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:58:58.872478 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872448 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:58:58.872478 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872452 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a7089632-2275-4236-88ee-572597e89f09" (UID: "a7089632-2275-4236-88ee-572597e89f09"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:58:58.872547 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872468 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a7089632-2275-4236-88ee-572597e89f09" (UID: "a7089632-2275-4236-88ee-572597e89f09"). 
InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:58:58.872796 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.872779 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a7089632-2275-4236-88ee-572597e89f09" (UID: "a7089632-2275-4236-88ee-572597e89f09"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:58:58.874239 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.874218 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7089632-2275-4236-88ee-572597e89f09-kube-api-access-p5cr6" (OuterVolumeSpecName: "kube-api-access-p5cr6") pod "a7089632-2275-4236-88ee-572597e89f09" (UID: "a7089632-2275-4236-88ee-572597e89f09"). InnerVolumeSpecName "kube-api-access-p5cr6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:58:58.874294 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.874231 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7089632-2275-4236-88ee-572597e89f09-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a7089632-2275-4236-88ee-572597e89f09" (UID: "a7089632-2275-4236-88ee-572597e89f09"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:58:58.904257 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.904227 2576 generic.go:358] "Generic (PLEG): container finished" podID="a7089632-2275-4236-88ee-572597e89f09" containerID="1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9" exitCode=0 Apr 23 18:58:58.904407 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.904300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" event={"ID":"a7089632-2275-4236-88ee-572597e89f09","Type":"ContainerDied","Data":"1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9"} Apr 23 18:58:58.904407 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.904309 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" Apr 23 18:58:58.904407 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.904329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l" event={"ID":"a7089632-2275-4236-88ee-572597e89f09","Type":"ContainerDied","Data":"3790af7fbe39385db8becc9836abae3e172278a2f9052a7c6271721d4c6fc659"} Apr 23 18:58:58.904407 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.904344 2576 scope.go:117] "RemoveContainer" containerID="1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9" Apr 23 18:58:58.914061 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.914043 2576 scope.go:117] "RemoveContainer" containerID="98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20" Apr 23 18:58:58.921868 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.921847 2576 scope.go:117] "RemoveContainer" containerID="c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792" Apr 23 18:58:58.930270 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.930249 
2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l"] Apr 23 18:58:58.930430 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.930409 2576 scope.go:117] "RemoveContainer" containerID="1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9" Apr 23 18:58:58.930671 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:58:58.930650 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9\": container with ID starting with 1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9 not found: ID does not exist" containerID="1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9" Apr 23 18:58:58.930756 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.930688 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9"} err="failed to get container status \"1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9\": rpc error: code = NotFound desc = could not find container \"1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9\": container with ID starting with 1df0a453a00ef5f80ca0fef592c34fc7289829f9e7a61a0bb7abc548686992f9 not found: ID does not exist" Apr 23 18:58:58.930756 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.930739 2576 scope.go:117] "RemoveContainer" containerID="98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20" Apr 23 18:58:58.930981 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:58:58.930953 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20\": container with ID starting with 
98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20 not found: ID does not exist" containerID="98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20" Apr 23 18:58:58.931054 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.930977 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20"} err="failed to get container status \"98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20\": rpc error: code = NotFound desc = could not find container \"98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20\": container with ID starting with 98fa175c1327e1241f04e867a458256363dac5110254cd325be46be596000f20 not found: ID does not exist" Apr 23 18:58:58.931054 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.930999 2576 scope.go:117] "RemoveContainer" containerID="c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792" Apr 23 18:58:58.931233 ip-10-0-138-68 kubenswrapper[2576]: E0423 18:58:58.931213 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792\": container with ID starting with c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792 not found: ID does not exist" containerID="c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792" Apr 23 18:58:58.931273 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.931240 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792"} err="failed to get container status \"c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792\": rpc error: code = NotFound desc = could not find container \"c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792\": container with ID starting with 
c5d28f1b205e2df3a44ef33c1f4ddb014899df4260ac538b3b1f47c98742f792 not found: ID does not exist" Apr 23 18:58:58.935752 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.935731 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schevm82l"] Apr 23 18:58:58.973663 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.973637 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:58:58.973663 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.973659 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:58:58.973818 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.973670 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a7089632-2275-4236-88ee-572597e89f09-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:58:58.973818 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.973681 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a7089632-2275-4236-88ee-572597e89f09-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:58:58.973818 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:58:58.973690 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5cr6\" (UniqueName: \"kubernetes.io/projected/a7089632-2275-4236-88ee-572597e89f09-kube-api-access-p5cr6\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 18:59:00.648364 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:00.648322 2576 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7089632-2275-4236-88ee-572597e89f09" path="/var/lib/kubelet/pods/a7089632-2275-4236-88ee-572597e89f09/volumes" Apr 23 18:59:14.065864 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.065821 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"] Apr 23 18:59:14.066413 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.066317 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="storage-initializer" Apr 23 18:59:14.066413 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.066332 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="storage-initializer" Apr 23 18:59:14.066413 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.066346 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="main" Apr 23 18:59:14.066413 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.066352 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="main" Apr 23 18:59:14.066413 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.066361 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="tokenizer" Apr 23 18:59:14.066413 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.066367 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="tokenizer" Apr 23 18:59:14.066742 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.066434 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="main" Apr 23 18:59:14.066742 ip-10-0-138-68 kubenswrapper[2576]: I0423 
18:59:14.066446 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7089632-2275-4236-88ee-572597e89f09" containerName="tokenizer"
Apr 23 18:59:14.071506 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.071483 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.074379 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.074354 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-bq7vj\""
Apr 23 18:59:14.074522 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.074397 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 23 18:59:14.083373 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.083346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"]
Apr 23 18:59:14.210174 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.210139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.210174 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.210183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fvd\" (UniqueName: \"kubernetes.io/projected/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kube-api-access-99fvd\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.210446 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.210283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.210446 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.210312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.210446 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.210343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.210446 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.210366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.311471 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.311471 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.311703 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.311703 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.311703 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.311703 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99fvd\" (UniqueName: \"kubernetes.io/projected/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kube-api-access-99fvd\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.311932 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.312008 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.312008 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.311999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.312119 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.312041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.314078 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.314049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.321904 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.321850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fvd\" (UniqueName: \"kubernetes.io/projected/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kube-api-access-99fvd\") pod \"custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.382607 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.382577 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:14.517343 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.517311 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"]
Apr 23 18:59:14.517939 ip-10-0-138-68 kubenswrapper[2576]: W0423 18:59:14.517913 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3640f9c1_31b8_45a1_b96e_506a79fff5a7.slice/crio-5a1fc222af894f3df4d4c2f590c09342e274b3f28a91d5cd9ccda2b5253b9c31 WatchSource:0}: Error finding container 5a1fc222af894f3df4d4c2f590c09342e274b3f28a91d5cd9ccda2b5253b9c31: Status 404 returned error can't find the container with id 5a1fc222af894f3df4d4c2f590c09342e274b3f28a91d5cd9ccda2b5253b9c31
Apr 23 18:59:14.519839 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.519820 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:59:14.970640 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.970548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" event={"ID":"3640f9c1-31b8-45a1-b96e-506a79fff5a7","Type":"ContainerStarted","Data":"7269cf17ca01905a2ee38b9cacd3f33ff528bb185d27d9e2fe28ec78acfb667e"}
Apr 23 18:59:14.970640 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:14.970593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" event={"ID":"3640f9c1-31b8-45a1-b96e-506a79fff5a7","Type":"ContainerStarted","Data":"5a1fc222af894f3df4d4c2f590c09342e274b3f28a91d5cd9ccda2b5253b9c31"}
Apr 23 18:59:15.975634 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:15.975597 2576 generic.go:358] "Generic (PLEG): container finished" podID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerID="7269cf17ca01905a2ee38b9cacd3f33ff528bb185d27d9e2fe28ec78acfb667e" exitCode=0
Apr 23 18:59:15.976123 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:15.975658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" event={"ID":"3640f9c1-31b8-45a1-b96e-506a79fff5a7","Type":"ContainerDied","Data":"7269cf17ca01905a2ee38b9cacd3f33ff528bb185d27d9e2fe28ec78acfb667e"}
Apr 23 18:59:16.982177 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:16.982132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" event={"ID":"3640f9c1-31b8-45a1-b96e-506a79fff5a7","Type":"ContainerStarted","Data":"e0aaaa9caba6137d4260197f71468f6d2db7d11d0adfbd8e9ec530a9693fe9ac"}
Apr 23 18:59:16.982177 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:16.982176 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" event={"ID":"3640f9c1-31b8-45a1-b96e-506a79fff5a7","Type":"ContainerStarted","Data":"dac2e78af20f513fc59df9d14ca5843ce2be19496e6495ff1b97683f1dd13cb8"}
Apr 23 18:59:16.982614 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:16.982293 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:17.006034 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:17.005976 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" podStartSLOduration=3.005958663 podStartE2EDuration="3.005958663s" podCreationTimestamp="2026-04-23 18:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:59:17.004626213 +0000 UTC m=+4664.934638850" watchObservedRunningTime="2026-04-23 18:59:17.005958663 +0000 UTC m=+4664.935971301"
Apr 23 18:59:24.383529 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:24.383439 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:24.383529 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:24.383492 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:24.386187 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:24.386158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:25.021948 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:25.021914 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 18:59:46.026456 ip-10-0-138-68 kubenswrapper[2576]: I0423 18:59:46.026424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 19:00:03.983924 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:03.983890 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs"]
Apr 23 19:00:03.984441 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:03.984178 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="main" containerID="cri-o://9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5" gracePeriod=30
Apr 23 19:00:03.984441 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:03.984218 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="tokenizer" containerID="cri-o://47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc" gracePeriod=30
Apr 23 19:00:04.179122 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:04.179063 2576 generic.go:358] "Generic (PLEG): container finished" podID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerID="9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5" exitCode=0
Apr 23 19:00:04.179313 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:04.179114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" event={"ID":"5764fd45-af8b-4311-b966-f79bd178fc4c","Type":"ContainerDied","Data":"9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5"}
Apr 23 19:00:05.133985 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.133962 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs"
Apr 23 19:00:05.185687 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.185655 2576 generic.go:358] "Generic (PLEG): container finished" podID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerID="47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc" exitCode=0
Apr 23 19:00:05.185897 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.185745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" event={"ID":"5764fd45-af8b-4311-b966-f79bd178fc4c","Type":"ContainerDied","Data":"47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc"}
Apr 23 19:00:05.185897 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.185772 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs"
Apr 23 19:00:05.185897 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.185791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs" event={"ID":"5764fd45-af8b-4311-b966-f79bd178fc4c","Type":"ContainerDied","Data":"c576677019dcf5f16e251bf97cbd3c3eefb7a0ade852027ee9b17cc1a68334ff"}
Apr 23 19:00:05.185897 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.185812 2576 scope.go:117] "RemoveContainer" containerID="47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc"
Apr 23 19:00:05.194685 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.194665 2576 scope.go:117] "RemoveContainer" containerID="9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5"
Apr 23 19:00:05.202674 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.202654 2576 scope.go:117] "RemoveContainer" containerID="088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8"
Apr 23 19:00:05.210487 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.210467 2576 scope.go:117] "RemoveContainer" containerID="47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc"
Apr 23 19:00:05.210761 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:00:05.210737 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc\": container with ID starting with 47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc not found: ID does not exist" containerID="47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc"
Apr 23 19:00:05.210811 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.210771 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc"} err="failed to get container status \"47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc\": rpc error: code = NotFound desc = could not find container \"47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc\": container with ID starting with 47c7afb003585002ec74eee6a1ddad134a33a02e2ac04b8afcee335083c7a3fc not found: ID does not exist"
Apr 23 19:00:05.210811 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.210792 2576 scope.go:117] "RemoveContainer" containerID="9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5"
Apr 23 19:00:05.211011 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:00:05.210994 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5\": container with ID starting with 9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5 not found: ID does not exist" containerID="9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5"
Apr 23 19:00:05.211060 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.211018 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5"} err="failed to get container status \"9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5\": rpc error: code = NotFound desc = could not find container \"9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5\": container with ID starting with 9ed8837ba187958ab5b228b416b29b3e1f359c807a61f23ad16731ad0d059fc5 not found: ID does not exist"
Apr 23 19:00:05.211060 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.211035 2576 scope.go:117] "RemoveContainer" containerID="088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8"
Apr 23 19:00:05.211271 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:00:05.211249 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8\": container with ID starting with 088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8 not found: ID does not exist" containerID="088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8"
Apr 23 19:00:05.211370 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.211281 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8"} err="failed to get container status \"088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8\": rpc error: code = NotFound desc = could not find container \"088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8\": container with ID starting with 088b93ffc0c23118896c10739b037b912d11d5f8802fa1ff256d09509e9aaab8 not found: ID does not exist"
Apr 23 19:00:05.268802 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.268769 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-kserve-provision-location\") pod \"5764fd45-af8b-4311-b966-f79bd178fc4c\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") "
Apr 23 19:00:05.268967 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.268823 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-uds\") pod \"5764fd45-af8b-4311-b966-f79bd178fc4c\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") "
Apr 23 19:00:05.268967 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.268877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-tmp\") pod \"5764fd45-af8b-4311-b966-f79bd178fc4c\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") "
Apr 23 19:00:05.268967 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.268902 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5764fd45-af8b-4311-b966-f79bd178fc4c-tls-certs\") pod \"5764fd45-af8b-4311-b966-f79bd178fc4c\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") "
Apr 23 19:00:05.268967 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.268927 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nwdz\" (UniqueName: \"kubernetes.io/projected/5764fd45-af8b-4311-b966-f79bd178fc4c-kube-api-access-2nwdz\") pod \"5764fd45-af8b-4311-b966-f79bd178fc4c\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") "
Apr 23 19:00:05.268967 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.268959 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-cache\") pod \"5764fd45-af8b-4311-b966-f79bd178fc4c\" (UID: \"5764fd45-af8b-4311-b966-f79bd178fc4c\") "
Apr 23 19:00:05.269199 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.269172 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5764fd45-af8b-4311-b966-f79bd178fc4c" (UID: "5764fd45-af8b-4311-b966-f79bd178fc4c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:00:05.269327 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.269309 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:00:05.269405 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.269321 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5764fd45-af8b-4311-b966-f79bd178fc4c" (UID: "5764fd45-af8b-4311-b966-f79bd178fc4c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:00:05.269405 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.269336 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5764fd45-af8b-4311-b966-f79bd178fc4c" (UID: "5764fd45-af8b-4311-b966-f79bd178fc4c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:00:05.269638 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.269611 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5764fd45-af8b-4311-b966-f79bd178fc4c" (UID: "5764fd45-af8b-4311-b966-f79bd178fc4c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:00:05.271062 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.271037 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5764fd45-af8b-4311-b966-f79bd178fc4c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5764fd45-af8b-4311-b966-f79bd178fc4c" (UID: "5764fd45-af8b-4311-b966-f79bd178fc4c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 19:00:05.271186 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.271167 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5764fd45-af8b-4311-b966-f79bd178fc4c-kube-api-access-2nwdz" (OuterVolumeSpecName: "kube-api-access-2nwdz") pod "5764fd45-af8b-4311-b966-f79bd178fc4c" (UID: "5764fd45-af8b-4311-b966-f79bd178fc4c"). InnerVolumeSpecName "kube-api-access-2nwdz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 19:00:05.370507 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.370465 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:00:05.370507 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.370502 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5764fd45-af8b-4311-b966-f79bd178fc4c-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:00:05.370507 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.370512 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nwdz\" (UniqueName: \"kubernetes.io/projected/5764fd45-af8b-4311-b966-f79bd178fc4c-kube-api-access-2nwdz\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:00:05.370791 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.370523 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:00:05.370791 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.370533 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5764fd45-af8b-4311-b966-f79bd178fc4c-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:00:05.511621 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.511590 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs"]
Apr 23 19:00:05.515222 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:05.515194 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-86f46sqqjs"]
Apr 23 19:00:06.654116 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:06.654079 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" path="/var/lib/kubelet/pods/5764fd45-af8b-4311-b966-f79bd178fc4c/volumes"
Apr 23 19:00:21.819203 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819170 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"]
Apr 23 19:00:21.819800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819686 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="storage-initializer"
Apr 23 19:00:21.819800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819701 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="storage-initializer"
Apr 23 19:00:21.819800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819730 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="main"
Apr 23 19:00:21.819800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819738 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="main"
Apr 23 19:00:21.819800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819754 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="tokenizer"
Apr 23 19:00:21.819800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819760 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="tokenizer"
Apr 23 19:00:21.820253 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819829 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="main"
Apr 23 19:00:21.820253 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.819841 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5764fd45-af8b-4311-b966-f79bd178fc4c" containerName="tokenizer"
Apr 23 19:00:21.823638 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.823615 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:21.826641 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.826614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 23 19:00:21.826790 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.826685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-ltz4s\""
Apr 23 19:00:21.833788 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.833764 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"]
Apr 23 19:00:21.924839 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.924799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:21.925028 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.924898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx27s\" (UniqueName: \"kubernetes.io/projected/45faca15-6246-41e6-b4d1-4beb4a08584b-kube-api-access-zx27s\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:21.925028 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.924951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:21.925028 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.924976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:21.925028 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.925010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45faca15-6246-41e6-b4d1-4beb4a08584b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:21.925479 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:21.925055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:22.026452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.026402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:22.026655 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.026540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx27s\" (UniqueName: \"kubernetes.io/projected/45faca15-6246-41e6-b4d1-4beb4a08584b-kube-api-access-zx27s\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:22.026655 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.026571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:22.026655 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.026591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:22.026655 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.026614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45faca15-6246-41e6-b4d1-4beb4a08584b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:22.026966 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.026768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:22.026966 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.026904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"
Apr 23 19:00:22.026966 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.026935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName:
\"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:22.027086 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.027002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:22.027134 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.027110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:22.029136 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.029117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45faca15-6246-41e6-b4d1-4beb4a08584b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:22.035983 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.035948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx27s\" (UniqueName: 
\"kubernetes.io/projected/45faca15-6246-41e6-b4d1-4beb4a08584b-kube-api-access-zx27s\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:22.135448 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.135362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:22.481787 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:22.481754 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"] Apr 23 19:00:22.483162 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:00:22.483132 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45faca15_6246_41e6_b4d1_4beb4a08584b.slice/crio-a87d165b93b086a19810f09ca65226500d8328c2919bbe54c6b563f8e5c38fce WatchSource:0}: Error finding container a87d165b93b086a19810f09ca65226500d8328c2919bbe54c6b563f8e5c38fce: Status 404 returned error can't find the container with id a87d165b93b086a19810f09ca65226500d8328c2919bbe54c6b563f8e5c38fce Apr 23 19:00:23.265082 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:23.265052 2576 generic.go:358] "Generic (PLEG): container finished" podID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerID="1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1" exitCode=0 Apr 23 19:00:23.265419 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:23.265147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" event={"ID":"45faca15-6246-41e6-b4d1-4beb4a08584b","Type":"ContainerDied","Data":"1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1"} Apr 23 19:00:23.265419 
ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:23.265188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" event={"ID":"45faca15-6246-41e6-b4d1-4beb4a08584b","Type":"ContainerStarted","Data":"a87d165b93b086a19810f09ca65226500d8328c2919bbe54c6b563f8e5c38fce"} Apr 23 19:00:24.271439 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:24.271401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" event={"ID":"45faca15-6246-41e6-b4d1-4beb4a08584b","Type":"ContainerStarted","Data":"74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d"} Apr 23 19:00:24.271439 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:24.271438 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" event={"ID":"45faca15-6246-41e6-b4d1-4beb4a08584b","Type":"ContainerStarted","Data":"bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e"} Apr 23 19:00:24.271900 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:24.271553 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:24.302220 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:24.302173 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" podStartSLOduration=3.302158321 podStartE2EDuration="3.302158321s" podCreationTimestamp="2026-04-23 19:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:00:24.299421481 +0000 UTC m=+4732.229434118" watchObservedRunningTime="2026-04-23 19:00:24.302158321 +0000 UTC m=+4732.232170956" Apr 23 
19:00:32.136431 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:32.136394 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:32.136431 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:32.136445 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:32.139889 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:32.139865 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:32.306289 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:32.306255 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:53.311083 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:53.311002 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:54.724403 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:54.724364 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"] Apr 23 19:00:54.724851 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:54.724696 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="main" containerID="cri-o://bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e" gracePeriod=30 Apr 23 19:00:54.724851 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:54.724790 2576 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="tokenizer" containerID="cri-o://74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d" gracePeriod=30 Apr 23 19:00:55.399353 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:55.399317 2576 generic.go:358] "Generic (PLEG): container finished" podID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerID="bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e" exitCode=0 Apr 23 19:00:55.399530 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:55.399386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" event={"ID":"45faca15-6246-41e6-b4d1-4beb4a08584b","Type":"ContainerDied","Data":"bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e"} Apr 23 19:00:56.102006 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.101941 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:56.233815 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.233784 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-uds\") pod \"45faca15-6246-41e6-b4d1-4beb4a08584b\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " Apr 23 19:00:56.233815 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.233819 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-cache\") pod \"45faca15-6246-41e6-b4d1-4beb4a08584b\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " Apr 23 19:00:56.234080 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.233845 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-tmp\") pod \"45faca15-6246-41e6-b4d1-4beb4a08584b\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " Apr 23 19:00:56.234080 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.233903 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx27s\" (UniqueName: \"kubernetes.io/projected/45faca15-6246-41e6-b4d1-4beb4a08584b-kube-api-access-zx27s\") pod \"45faca15-6246-41e6-b4d1-4beb4a08584b\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " Apr 23 19:00:56.234080 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.233952 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45faca15-6246-41e6-b4d1-4beb4a08584b-tls-certs\") pod \"45faca15-6246-41e6-b4d1-4beb4a08584b\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " Apr 23 19:00:56.234080 
ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.233980 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-kserve-provision-location\") pod \"45faca15-6246-41e6-b4d1-4beb4a08584b\" (UID: \"45faca15-6246-41e6-b4d1-4beb4a08584b\") " Apr 23 19:00:56.234251 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.234071 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "45faca15-6246-41e6-b4d1-4beb4a08584b" (UID: "45faca15-6246-41e6-b4d1-4beb4a08584b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:56.234251 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.234119 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "45faca15-6246-41e6-b4d1-4beb4a08584b" (UID: "45faca15-6246-41e6-b4d1-4beb4a08584b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:56.234251 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.234234 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "45faca15-6246-41e6-b4d1-4beb4a08584b" (UID: "45faca15-6246-41e6-b4d1-4beb4a08584b"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:56.234362 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.234321 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:00:56.234362 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.234336 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:00:56.234362 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.234346 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:00:56.234763 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.234739 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "45faca15-6246-41e6-b4d1-4beb4a08584b" (UID: "45faca15-6246-41e6-b4d1-4beb4a08584b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:00:56.236061 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.236040 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45faca15-6246-41e6-b4d1-4beb4a08584b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "45faca15-6246-41e6-b4d1-4beb4a08584b" (UID: "45faca15-6246-41e6-b4d1-4beb4a08584b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:00:56.236109 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.236090 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45faca15-6246-41e6-b4d1-4beb4a08584b-kube-api-access-zx27s" (OuterVolumeSpecName: "kube-api-access-zx27s") pod "45faca15-6246-41e6-b4d1-4beb4a08584b" (UID: "45faca15-6246-41e6-b4d1-4beb4a08584b"). InnerVolumeSpecName "kube-api-access-zx27s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:00:56.335236 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.335192 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zx27s\" (UniqueName: \"kubernetes.io/projected/45faca15-6246-41e6-b4d1-4beb4a08584b-kube-api-access-zx27s\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:00:56.335236 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.335229 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45faca15-6246-41e6-b4d1-4beb4a08584b-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:00:56.335236 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.335245 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45faca15-6246-41e6-b4d1-4beb4a08584b-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:00:56.405408 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.405312 2576 generic.go:358] "Generic (PLEG): container finished" podID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerID="74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d" exitCode=0 Apr 23 19:00:56.405408 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.405397 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" Apr 23 19:00:56.405635 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.405393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" event={"ID":"45faca15-6246-41e6-b4d1-4beb4a08584b","Type":"ContainerDied","Data":"74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d"} Apr 23 19:00:56.405635 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.405509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2" event={"ID":"45faca15-6246-41e6-b4d1-4beb4a08584b","Type":"ContainerDied","Data":"a87d165b93b086a19810f09ca65226500d8328c2919bbe54c6b563f8e5c38fce"} Apr 23 19:00:56.405635 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.405525 2576 scope.go:117] "RemoveContainer" containerID="74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d" Apr 23 19:00:56.416908 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.416841 2576 scope.go:117] "RemoveContainer" containerID="bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e" Apr 23 19:00:56.425468 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.425445 2576 scope.go:117] "RemoveContainer" containerID="1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1" Apr 23 19:00:56.431018 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.430993 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"] Apr 23 19:00:56.435476 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.435455 2576 scope.go:117] "RemoveContainer" containerID="74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d" Apr 23 19:00:56.435834 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:00:56.435809 2576 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d\": container with ID starting with 74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d not found: ID does not exist" containerID="74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d" Apr 23 19:00:56.435928 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.435845 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d"} err="failed to get container status \"74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d\": rpc error: code = NotFound desc = could not find container \"74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d\": container with ID starting with 74275bf61a9b6f7b7d02e117d5d0394c8a027e1fc2a005ef1483b45e90b1183d not found: ID does not exist" Apr 23 19:00:56.435928 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.435866 2576 scope.go:117] "RemoveContainer" containerID="bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e" Apr 23 19:00:56.436144 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:00:56.436125 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e\": container with ID starting with bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e not found: ID does not exist" containerID="bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e" Apr 23 19:00:56.436195 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.436149 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e"} err="failed to get container status 
\"bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e\": rpc error: code = NotFound desc = could not find container \"bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e\": container with ID starting with bf7f8dca669de450042f15684b8f6b0bfd13029cd0436ff7cf0512edd846817e not found: ID does not exist" Apr 23 19:00:56.436195 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.436166 2576 scope.go:117] "RemoveContainer" containerID="1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1" Apr 23 19:00:56.436403 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:00:56.436384 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1\": container with ID starting with 1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1 not found: ID does not exist" containerID="1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1" Apr 23 19:00:56.436469 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.436406 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1"} err="failed to get container status \"1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1\": rpc error: code = NotFound desc = could not find container \"1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1\": container with ID starting with 1ed70f076b0f79a3cde9fb7fe1ee3faa46e67f11f6d9f3ef7ca3acaca69c34d1 not found: ID does not exist" Apr 23 19:00:56.437482 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.437463 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-57f8f59sszp2"] Apr 23 19:00:56.648660 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:00:56.648630 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" path="/var/lib/kubelet/pods/45faca15-6246-41e6-b4d1-4beb4a08584b/volumes" Apr 23 19:01:05.977200 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977161 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh"] Apr 23 19:01:05.977671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977554 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="storage-initializer" Apr 23 19:01:05.977671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977565 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="storage-initializer" Apr 23 19:01:05.977671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977576 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="tokenizer" Apr 23 19:01:05.977671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977581 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="tokenizer" Apr 23 19:01:05.977671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977590 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="main" Apr 23 19:01:05.977671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977596 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="main" Apr 23 19:01:05.977671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977653 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="tokenizer" Apr 23 19:01:05.977671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.977662 2576 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="45faca15-6246-41e6-b4d1-4beb4a08584b" containerName="main" Apr 23 19:01:05.982625 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.982604 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:05.986222 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.986199 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-jthzm\"" Apr 23 19:01:05.986739 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.986708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 23 19:01:05.993799 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:05.993767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh"] Apr 23 19:01:06.130883 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.130853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30030cc6-635c-4640-9dfa-bfec151ccdba-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.130883 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.130891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.131084 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.130912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl8s6\" (UniqueName: \"kubernetes.io/projected/30030cc6-635c-4640-9dfa-bfec151ccdba-kube-api-access-tl8s6\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.131084 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.131039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.131084 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.131069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.131199 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.131147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-kserve-provision-location\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232241 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30030cc6-635c-4640-9dfa-bfec151ccdba-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232241 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232241 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl8s6\" (UniqueName: \"kubernetes.io/projected/30030cc6-635c-4640-9dfa-bfec151ccdba-kube-api-access-tl8s6\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232538 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-tmp\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232538 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232538 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232706 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232706 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-cache\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232825 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.232825 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.232785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.234681 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.234665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30030cc6-635c-4640-9dfa-bfec151ccdba-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.259991 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.259952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl8s6\" (UniqueName: \"kubernetes.io/projected/30030cc6-635c-4640-9dfa-bfec151ccdba-kube-api-access-tl8s6\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.293815 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.293769 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:06.443061 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:06.443031 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh"] Apr 23 19:01:06.444146 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:01:06.444100 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30030cc6_635c_4640_9dfa_bfec151ccdba.slice/crio-bd6e6aae39f7377bf660fd88d3f7e44fc363db4860bd033362f15e02f09b8d25 WatchSource:0}: Error finding container bd6e6aae39f7377bf660fd88d3f7e44fc363db4860bd033362f15e02f09b8d25: Status 404 returned error can't find the container with id bd6e6aae39f7377bf660fd88d3f7e44fc363db4860bd033362f15e02f09b8d25 Apr 23 19:01:07.451951 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:07.451908 2576 generic.go:358] "Generic (PLEG): container finished" podID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerID="bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083" exitCode=0 Apr 23 19:01:07.452360 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:07.451996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" event={"ID":"30030cc6-635c-4640-9dfa-bfec151ccdba","Type":"ContainerDied","Data":"bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083"} Apr 23 19:01:07.452360 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:07.452034 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" event={"ID":"30030cc6-635c-4640-9dfa-bfec151ccdba","Type":"ContainerStarted","Data":"bd6e6aae39f7377bf660fd88d3f7e44fc363db4860bd033362f15e02f09b8d25"} Apr 23 19:01:08.458665 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:08.458631 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" event={"ID":"30030cc6-635c-4640-9dfa-bfec151ccdba","Type":"ContainerStarted","Data":"aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899"} Apr 23 19:01:08.458665 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:08.458668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" event={"ID":"30030cc6-635c-4640-9dfa-bfec151ccdba","Type":"ContainerStarted","Data":"358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1"} Apr 23 19:01:08.459101 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:08.458882 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:08.492903 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:08.492846 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" podStartSLOduration=3.492831272 podStartE2EDuration="3.492831272s" podCreationTimestamp="2026-04-23 19:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:01:08.489643634 +0000 UTC m=+4776.419656268" watchObservedRunningTime="2026-04-23 19:01:08.492831272 +0000 UTC m=+4776.422843909" Apr 23 19:01:16.294168 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:16.294122 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:16.294168 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:16.294183 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:16.295514 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:01:16.295483 2576 logging.go:55] [core] [Channel #2185 SubChannel #2186]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.49:9003", ServerName: "10.133.0.49:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.49:9003: connect: connection refused" Apr 23 19:01:16.296812 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:16.296787 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:16.496121 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:16.496092 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:17.294618 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:17.294576 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.49:9003\" within 1s: context deadline exceeded" Apr 23 19:01:26.294645 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:01:26.294613 2576 logging.go:55] [core] [Channel #2193 SubChannel #2194]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.49:9003", ServerName: "10.133.0.49:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.49:9003: connect: connection refused" Apr 23 19:01:27.294405 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:27.294358 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.49:9003\" within 1s: context deadline exceeded" Apr 23 19:01:32.975306 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:32.975201 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 19:01:32.989381 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:32.978927 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 19:01:32.989381 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:32.988684 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 19:01:32.992192 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:32.992174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 19:01:37.500656 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:37.500622 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:38.307147 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:38.307114 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh"] Apr 23 19:01:38.307453 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:38.307430 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="main" containerID="cri-o://358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1" gracePeriod=30 Apr 23 19:01:38.307542 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:38.307468 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="tokenizer" containerID="cri-o://aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899" gracePeriod=30 Apr 23 19:01:38.582679 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:38.582647 2576 generic.go:358] "Generic (PLEG): container finished" podID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerID="358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1" exitCode=0 Apr 23 19:01:38.583115 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:38.582734 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" event={"ID":"30030cc6-635c-4640-9dfa-bfec151ccdba","Type":"ContainerDied","Data":"358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1"} Apr 23 19:01:39.862946 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.862922 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:39.952115 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-tmp\") pod \"30030cc6-635c-4640-9dfa-bfec151ccdba\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " Apr 23 19:01:39.952265 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952146 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl8s6\" (UniqueName: \"kubernetes.io/projected/30030cc6-635c-4640-9dfa-bfec151ccdba-kube-api-access-tl8s6\") pod \"30030cc6-635c-4640-9dfa-bfec151ccdba\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " Apr 23 19:01:39.952265 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952190 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-kserve-provision-location\") pod \"30030cc6-635c-4640-9dfa-bfec151ccdba\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " Apr 23 19:01:39.952265 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952231 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30030cc6-635c-4640-9dfa-bfec151ccdba-tls-certs\") pod \"30030cc6-635c-4640-9dfa-bfec151ccdba\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " Apr 23 19:01:39.952408 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952266 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-cache\") pod \"30030cc6-635c-4640-9dfa-bfec151ccdba\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " 
Apr 23 19:01:39.952408 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952292 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-uds\") pod \"30030cc6-635c-4640-9dfa-bfec151ccdba\" (UID: \"30030cc6-635c-4640-9dfa-bfec151ccdba\") " Apr 23 19:01:39.952517 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952454 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "30030cc6-635c-4640-9dfa-bfec151ccdba" (UID: "30030cc6-635c-4640-9dfa-bfec151ccdba"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.952634 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952605 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "30030cc6-635c-4640-9dfa-bfec151ccdba" (UID: "30030cc6-635c-4640-9dfa-bfec151ccdba"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.952697 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952622 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:01:39.952778 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.952686 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "30030cc6-635c-4640-9dfa-bfec151ccdba" (UID: "30030cc6-635c-4640-9dfa-bfec151ccdba"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.953133 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.953104 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "30030cc6-635c-4640-9dfa-bfec151ccdba" (UID: "30030cc6-635c-4640-9dfa-bfec151ccdba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:01:39.954365 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.954340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30030cc6-635c-4640-9dfa-bfec151ccdba-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "30030cc6-635c-4640-9dfa-bfec151ccdba" (UID: "30030cc6-635c-4640-9dfa-bfec151ccdba"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:01:39.954437 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:39.954370 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30030cc6-635c-4640-9dfa-bfec151ccdba-kube-api-access-tl8s6" (OuterVolumeSpecName: "kube-api-access-tl8s6") pod "30030cc6-635c-4640-9dfa-bfec151ccdba" (UID: "30030cc6-635c-4640-9dfa-bfec151ccdba"). InnerVolumeSpecName "kube-api-access-tl8s6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:01:40.054096 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.054053 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30030cc6-635c-4640-9dfa-bfec151ccdba-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:01:40.054096 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.054089 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:01:40.054096 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.054101 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:01:40.054344 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.054116 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tl8s6\" (UniqueName: \"kubernetes.io/projected/30030cc6-635c-4640-9dfa-bfec151ccdba-kube-api-access-tl8s6\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:01:40.054344 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.054130 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30030cc6-635c-4640-9dfa-bfec151ccdba-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:01:40.593198 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.593164 2576 generic.go:358] "Generic (PLEG): container finished" podID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerID="aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899" exitCode=0 Apr 23 19:01:40.593381 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.593252 2576 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" Apr 23 19:01:40.593381 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.593249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" event={"ID":"30030cc6-635c-4640-9dfa-bfec151ccdba","Type":"ContainerDied","Data":"aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899"} Apr 23 19:01:40.593381 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.593291 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh" event={"ID":"30030cc6-635c-4640-9dfa-bfec151ccdba","Type":"ContainerDied","Data":"bd6e6aae39f7377bf660fd88d3f7e44fc363db4860bd033362f15e02f09b8d25"} Apr 23 19:01:40.593381 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.593313 2576 scope.go:117] "RemoveContainer" containerID="aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899" Apr 23 19:01:40.601998 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.601973 2576 scope.go:117] "RemoveContainer" containerID="358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1" Apr 23 19:01:40.610553 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.610531 2576 scope.go:117] "RemoveContainer" containerID="bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083" Apr 23 19:01:40.621919 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.621325 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh"] Apr 23 19:01:40.622810 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.622784 2576 scope.go:117] "RemoveContainer" containerID="aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899" Apr 23 19:01:40.623195 ip-10-0-138-68 kubenswrapper[2576]: E0423 
19:01:40.623167 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899\": container with ID starting with aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899 not found: ID does not exist" containerID="aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899" Apr 23 19:01:40.623304 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.623210 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899"} err="failed to get container status \"aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899\": rpc error: code = NotFound desc = could not find container \"aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899\": container with ID starting with aa811eae0a8ccf4991fe6dce0d69d1d460c735448cc0c66fe209c77af25cc899 not found: ID does not exist" Apr 23 19:01:40.623304 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.623238 2576 scope.go:117] "RemoveContainer" containerID="358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1" Apr 23 19:01:40.623578 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:01:40.623531 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1\": container with ID starting with 358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1 not found: ID does not exist" containerID="358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1" Apr 23 19:01:40.623687 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.623586 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1"} err="failed to get container status 
\"358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1\": rpc error: code = NotFound desc = could not find container \"358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1\": container with ID starting with 358d1d9d82aab32637c9b3813f3e26e2c76676b066c80dfe655b1936dc1056e1 not found: ID does not exist" Apr 23 19:01:40.623687 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.623604 2576 scope.go:117] "RemoveContainer" containerID="bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083" Apr 23 19:01:40.624571 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:01:40.624099 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083\": container with ID starting with bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083 not found: ID does not exist" containerID="bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083" Apr 23 19:01:40.624571 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.624134 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083"} err="failed to get container status \"bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083\": rpc error: code = NotFound desc = could not find container \"bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083\": container with ID starting with bf034a07f3e831cbef33f920427cbb1e46359cc479954776c896b753f7637083 not found: ID does not exist" Apr 23 19:01:40.627037 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.627017 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-55c45dd7scpbh"] Apr 23 19:01:40.648160 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:01:40.648134 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" path="/var/lib/kubelet/pods/30030cc6-635c-4640-9dfa-bfec151ccdba/volumes"
Apr 23 19:03:32.868398 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:32.868358 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"]
Apr 23 19:03:32.868941 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:32.868818 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="main" containerID="cri-o://dac2e78af20f513fc59df9d14ca5843ce2be19496e6495ff1b97683f1dd13cb8" gracePeriod=30
Apr 23 19:03:32.868941 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:32.868859 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="tokenizer" containerID="cri-o://e0aaaa9caba6137d4260197f71468f6d2db7d11d0adfbd8e9ec530a9693fe9ac" gracePeriod=30
Apr 23 19:03:33.064437 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:33.064401 2576 generic.go:358] "Generic (PLEG): container finished" podID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerID="dac2e78af20f513fc59df9d14ca5843ce2be19496e6495ff1b97683f1dd13cb8" exitCode=0
Apr 23 19:03:33.064603 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:33.064470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" event={"ID":"3640f9c1-31b8-45a1-b96e-506a79fff5a7","Type":"ContainerDied","Data":"dac2e78af20f513fc59df9d14ca5843ce2be19496e6495ff1b97683f1dd13cb8"}
Apr 23 19:03:34.070499 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.070466 2576 generic.go:358] "Generic (PLEG): container finished" podID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerID="e0aaaa9caba6137d4260197f71468f6d2db7d11d0adfbd8e9ec530a9693fe9ac" exitCode=0
Apr 23 19:03:34.070893 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.070536 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" event={"ID":"3640f9c1-31b8-45a1-b96e-506a79fff5a7","Type":"ContainerDied","Data":"e0aaaa9caba6137d4260197f71468f6d2db7d11d0adfbd8e9ec530a9693fe9ac"}
Apr 23 19:03:34.126443 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.126385 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 19:03:34.247912 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.247878 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-tmp\") pod \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") "
Apr 23 19:03:34.248101 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.247924 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-uds\") pod \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") "
Apr 23 19:03:34.248101 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.247954 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-cache\") pod \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") "
Apr 23 19:03:34.248101 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.248005 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99fvd\" (UniqueName: \"kubernetes.io/projected/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kube-api-access-99fvd\") pod \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") "
Apr 23 19:03:34.248101 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.248030 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kserve-provision-location\") pod \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") "
Apr 23 19:03:34.248101 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.248082 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tls-certs\") pod \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\" (UID: \"3640f9c1-31b8-45a1-b96e-506a79fff5a7\") "
Apr 23 19:03:34.248375 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.248232 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3640f9c1-31b8-45a1-b96e-506a79fff5a7" (UID: "3640f9c1-31b8-45a1-b96e-506a79fff5a7"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:03:34.248375 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.248331 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3640f9c1-31b8-45a1-b96e-506a79fff5a7" (UID: "3640f9c1-31b8-45a1-b96e-506a79fff5a7"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:03:34.248375 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.248345 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3640f9c1-31b8-45a1-b96e-506a79fff5a7" (UID: "3640f9c1-31b8-45a1-b96e-506a79fff5a7"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:03:34.248375 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.248366 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:03:34.248821 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.248800 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3640f9c1-31b8-45a1-b96e-506a79fff5a7" (UID: "3640f9c1-31b8-45a1-b96e-506a79fff5a7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:03:34.250173 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.250152 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kube-api-access-99fvd" (OuterVolumeSpecName: "kube-api-access-99fvd") pod "3640f9c1-31b8-45a1-b96e-506a79fff5a7" (UID: "3640f9c1-31b8-45a1-b96e-506a79fff5a7"). InnerVolumeSpecName "kube-api-access-99fvd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 19:03:34.250243 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.250205 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3640f9c1-31b8-45a1-b96e-506a79fff5a7" (UID: "3640f9c1-31b8-45a1-b96e-506a79fff5a7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 19:03:34.349140 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.349090 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99fvd\" (UniqueName: \"kubernetes.io/projected/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kube-api-access-99fvd\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:03:34.349140 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.349133 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:03:34.349140 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.349161 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:03:34.349394 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.349171 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:03:34.349394 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:34.349180 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3640f9c1-31b8-45a1-b96e-506a79fff5a7-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:03:35.075851 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:35.075814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66" event={"ID":"3640f9c1-31b8-45a1-b96e-506a79fff5a7","Type":"ContainerDied","Data":"5a1fc222af894f3df4d4c2f590c09342e274b3f28a91d5cd9ccda2b5253b9c31"}
Apr 23 19:03:35.076309 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:35.075864 2576 scope.go:117] "RemoveContainer" containerID="e0aaaa9caba6137d4260197f71468f6d2db7d11d0adfbd8e9ec530a9693fe9ac"
Apr 23 19:03:35.076309 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:35.075833 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"
Apr 23 19:03:35.084488 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:35.084470 2576 scope.go:117] "RemoveContainer" containerID="dac2e78af20f513fc59df9d14ca5843ce2be19496e6495ff1b97683f1dd13cb8"
Apr 23 19:03:35.091988 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:35.091962 2576 scope.go:117] "RemoveContainer" containerID="7269cf17ca01905a2ee38b9cacd3f33ff528bb185d27d9e2fe28ec78acfb667e"
Apr 23 19:03:35.098849 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:35.098823 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"]
Apr 23 19:03:35.105662 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:35.105642 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7fccfb7cpgx66"]
Apr 23 19:03:36.650799 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:36.650754 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" path="/var/lib/kubelet/pods/3640f9c1-31b8-45a1-b96e-506a79fff5a7/volumes"
Apr 23 19:03:44.720497 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720450 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"]
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720833 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="storage-initializer"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720847 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="storage-initializer"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720865 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="storage-initializer"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720874 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="storage-initializer"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720883 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="main"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720901 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="main"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720907 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="main"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720912 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="main"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720930 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="tokenizer"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720938 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="tokenizer"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720950 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="tokenizer"
Apr 23 19:03:44.721011 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.720956 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="tokenizer"
Apr 23 19:03:44.721377 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.721023 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="tokenizer"
Apr 23 19:03:44.721377 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.721036 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3640f9c1-31b8-45a1-b96e-506a79fff5a7" containerName="main"
Apr 23 19:03:44.721377 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.721045 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="main"
Apr 23 19:03:44.721377 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.721052 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30030cc6-635c-4640-9dfa-bfec151ccdba" containerName="tokenizer"
Apr 23 19:03:44.724417 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.724396 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.728595 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.728570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\""
Apr 23 19:03:44.728763 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.728573 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 23 19:03:44.728763 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.728689 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-vcxdp\""
Apr 23 19:03:44.735248 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.735225 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"]
Apr 23 19:03:44.746596 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.746570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.746733 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.746600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.746733 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.746706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbxfx\" (UniqueName: \"kubernetes.io/projected/0fa536e4-594e-4565-af0d-c3e3372bde7e-kube-api-access-tbxfx\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.746832 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.746777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.746832 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.746803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa536e4-594e-4565-af0d-c3e3372bde7e-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.746909 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.746832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.847658 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.847621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbxfx\" (UniqueName: \"kubernetes.io/projected/0fa536e4-594e-4565-af0d-c3e3372bde7e-kube-api-access-tbxfx\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.847658 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.847659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.847914 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.847682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa536e4-594e-4565-af0d-c3e3372bde7e-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.847914 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.847802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.847914 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.847902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.848062 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.847932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.848117 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.848074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.848175 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.848146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.848231 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.848203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.848323 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.848302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.850100 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.850080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa536e4-594e-4565-af0d-c3e3372bde7e-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:44.857372 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:44.857338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbxfx\" (UniqueName: \"kubernetes.io/projected/0fa536e4-594e-4565-af0d-c3e3372bde7e-kube-api-access-tbxfx\") pod \"router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:45.034874 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:45.034778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:45.179388 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:45.179364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"]
Apr 23 19:03:45.180297 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:03:45.180269 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fa536e4_594e_4565_af0d_c3e3372bde7e.slice/crio-a50ff049439faa617f0f0bed79dd790e722e3e99c2ebfc6a6f0aae259e34cc28 WatchSource:0}: Error finding container a50ff049439faa617f0f0bed79dd790e722e3e99c2ebfc6a6f0aae259e34cc28: Status 404 returned error can't find the container with id a50ff049439faa617f0f0bed79dd790e722e3e99c2ebfc6a6f0aae259e34cc28
Apr 23 19:03:46.124139 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:46.124102 2576 generic.go:358] "Generic (PLEG): container finished" podID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerID="c49905b093db1acadda4d374eb5fe7ffd4074865a9a7338e196da3ce75537a84" exitCode=0
Apr 23 19:03:46.124545 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:46.124187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" event={"ID":"0fa536e4-594e-4565-af0d-c3e3372bde7e","Type":"ContainerDied","Data":"c49905b093db1acadda4d374eb5fe7ffd4074865a9a7338e196da3ce75537a84"}
Apr 23 19:03:46.124545 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:46.124222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" event={"ID":"0fa536e4-594e-4565-af0d-c3e3372bde7e","Type":"ContainerStarted","Data":"a50ff049439faa617f0f0bed79dd790e722e3e99c2ebfc6a6f0aae259e34cc28"}
Apr 23 19:03:47.129955 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:47.129917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" event={"ID":"0fa536e4-594e-4565-af0d-c3e3372bde7e","Type":"ContainerStarted","Data":"ae3d4d09119fe540a848b407ade93baed6a0dca039398fe90a1b47a006c489ab"}
Apr 23 19:03:47.129955 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:47.129957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" event={"ID":"0fa536e4-594e-4565-af0d-c3e3372bde7e","Type":"ContainerStarted","Data":"df6d7dc0c1d311565e7e0f279bb4402f55c94c523d2ccc432666673abe3a67ad"}
Apr 23 19:03:47.130517 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:47.129988 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:47.160410 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:47.160361 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" podStartSLOduration=3.160344144 podStartE2EDuration="3.160344144s" podCreationTimestamp="2026-04-23 19:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:03:47.158062985 +0000 UTC m=+4935.088075654" watchObservedRunningTime="2026-04-23 19:03:47.160344144 +0000 UTC m=+4935.090356780"
Apr 23 19:03:55.035654 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:55.035566 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:55.035654 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:55.035619 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:55.038455 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:55.038425 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:03:55.165300 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:03:55.165270 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:04:16.170225 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:04:16.170193 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:06:33.012216 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.012101 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log"
Apr 23 19:06:33.015883 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.015866 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log"
Apr 23 19:06:33.025794 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.025768 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log"
Apr 23 19:06:33.029205 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.029186 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log"
Apr 23 19:06:33.566083 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.566044 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"]
Apr 23 19:06:33.566414 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.566369 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="main" containerID="cri-o://df6d7dc0c1d311565e7e0f279bb4402f55c94c523d2ccc432666673abe3a67ad" gracePeriod=30
Apr 23 19:06:33.566473 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.566394 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="tokenizer" containerID="cri-o://ae3d4d09119fe540a848b407ade93baed6a0dca039398fe90a1b47a006c489ab" gracePeriod=30
Apr 23 19:06:33.803181 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.803151 2576 generic.go:358] "Generic (PLEG): container finished" podID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerID="df6d7dc0c1d311565e7e0f279bb4402f55c94c523d2ccc432666673abe3a67ad" exitCode=0
Apr 23 19:06:33.803394 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:33.803215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" event={"ID":"0fa536e4-594e-4565-af0d-c3e3372bde7e","Type":"ContainerDied","Data":"df6d7dc0c1d311565e7e0f279bb4402f55c94c523d2ccc432666673abe3a67ad"}
Apr 23 19:06:34.809421 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:34.809388 2576 generic.go:358] "Generic (PLEG): container finished" podID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerID="ae3d4d09119fe540a848b407ade93baed6a0dca039398fe90a1b47a006c489ab" exitCode=0
Apr 23 19:06:34.809798 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:34.809457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" event={"ID":"0fa536e4-594e-4565-af0d-c3e3372bde7e","Type":"ContainerDied","Data":"ae3d4d09119fe540a848b407ade93baed6a0dca039398fe90a1b47a006c489ab"}
Apr 23 19:06:34.919200 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:34.919176 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"
Apr 23 19:06:35.058915 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.058887 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbxfx\" (UniqueName: \"kubernetes.io/projected/0fa536e4-594e-4565-af0d-c3e3372bde7e-kube-api-access-tbxfx\") pod \"0fa536e4-594e-4565-af0d-c3e3372bde7e\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") "
Apr 23 19:06:35.059088 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.058922 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-kserve-provision-location\") pod \"0fa536e4-594e-4565-af0d-c3e3372bde7e\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") "
Apr 23 19:06:35.059088 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.058971 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-uds\") pod \"0fa536e4-594e-4565-af0d-c3e3372bde7e\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") "
Apr 23 19:06:35.059088 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.059034 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa536e4-594e-4565-af0d-c3e3372bde7e-tls-certs\") pod \"0fa536e4-594e-4565-af0d-c3e3372bde7e\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") "
Apr 23 19:06:35.059088 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.059081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-tmp\") pod \"0fa536e4-594e-4565-af0d-c3e3372bde7e\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") "
Apr 23 19:06:35.059371 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.059103 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-cache\") pod \"0fa536e4-594e-4565-af0d-c3e3372bde7e\" (UID: \"0fa536e4-594e-4565-af0d-c3e3372bde7e\") "
Apr 23 19:06:35.059371 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.059274 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0fa536e4-594e-4565-af0d-c3e3372bde7e" (UID: "0fa536e4-594e-4565-af0d-c3e3372bde7e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:06:35.059490 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.059465 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0fa536e4-594e-4565-af0d-c3e3372bde7e" (UID: "0fa536e4-594e-4565-af0d-c3e3372bde7e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:06:35.059534 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.059482 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0fa536e4-594e-4565-af0d-c3e3372bde7e" (UID: "0fa536e4-594e-4565-af0d-c3e3372bde7e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:06:35.059860 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.059839 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0fa536e4-594e-4565-af0d-c3e3372bde7e" (UID: "0fa536e4-594e-4565-af0d-c3e3372bde7e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:06:35.061183 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.061160 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa536e4-594e-4565-af0d-c3e3372bde7e-kube-api-access-tbxfx" (OuterVolumeSpecName: "kube-api-access-tbxfx") pod "0fa536e4-594e-4565-af0d-c3e3372bde7e" (UID: "0fa536e4-594e-4565-af0d-c3e3372bde7e"). InnerVolumeSpecName "kube-api-access-tbxfx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 19:06:35.061256 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.061197 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa536e4-594e-4565-af0d-c3e3372bde7e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0fa536e4-594e-4565-af0d-c3e3372bde7e" (UID: "0fa536e4-594e-4565-af0d-c3e3372bde7e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 19:06:35.160489 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.160456 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:06:35.160489 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.160487 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:06:35.160732 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.160503 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbxfx\" (UniqueName: \"kubernetes.io/projected/0fa536e4-594e-4565-af0d-c3e3372bde7e-kube-api-access-tbxfx\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:06:35.160732 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.160519 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:06:35.160732 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.160531 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0fa536e4-594e-4565-af0d-c3e3372bde7e-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:06:35.160732 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.160543 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa536e4-594e-4565-af0d-c3e3372bde7e-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:06:35.815275 ip-10-0-138-68 kubenswrapper[2576]: I0423
19:06:35.815239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" event={"ID":"0fa536e4-594e-4565-af0d-c3e3372bde7e","Type":"ContainerDied","Data":"a50ff049439faa617f0f0bed79dd790e722e3e99c2ebfc6a6f0aae259e34cc28"} Apr 23 19:06:35.815275 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.815260 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr" Apr 23 19:06:35.815788 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.815287 2576 scope.go:117] "RemoveContainer" containerID="ae3d4d09119fe540a848b407ade93baed6a0dca039398fe90a1b47a006c489ab" Apr 23 19:06:35.830637 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.830608 2576 scope.go:117] "RemoveContainer" containerID="df6d7dc0c1d311565e7e0f279bb4402f55c94c523d2ccc432666673abe3a67ad" Apr 23 19:06:35.838548 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.838527 2576 scope.go:117] "RemoveContainer" containerID="c49905b093db1acadda4d374eb5fe7ffd4074865a9a7338e196da3ce75537a84" Apr 23 19:06:35.844989 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.844962 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"] Apr 23 19:06:35.847578 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:35.847513 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-69f799fd49-2gjfr"] Apr 23 19:06:36.648099 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:36.648067 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" path="/var/lib/kubelet/pods/0fa536e4-594e-4565-af0d-c3e3372bde7e/volumes" Apr 23 19:06:48.463664 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.463628 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78"] Apr 23 19:06:48.464261 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.464179 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="tokenizer" Apr 23 19:06:48.464261 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.464199 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="tokenizer" Apr 23 19:06:48.464261 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.464226 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="storage-initializer" Apr 23 19:06:48.464261 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.464235 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="storage-initializer" Apr 23 19:06:48.464261 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.464246 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="main" Apr 23 19:06:48.464261 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.464254 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="main" Apr 23 19:06:48.464578 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.464352 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="tokenizer" Apr 23 19:06:48.464578 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.464365 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fa536e4-594e-4565-af0d-c3e3372bde7e" containerName="main" Apr 23 19:06:48.469187 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.469166 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.473126 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.473106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 19:06:48.473126 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.473109 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-fgw7b\"" Apr 23 19:06:48.473283 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.473109 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 23 19:06:48.481589 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.481562 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78"] Apr 23 19:06:48.581177 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.581136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.581352 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.581253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04f3b1-131e-4d1b-90cf-228596c99180-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.581352 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.581283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.581352 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.581320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.581352 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.581339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg7vh\" (UniqueName: \"kubernetes.io/projected/dc04f3b1-131e-4d1b-90cf-228596c99180-kube-api-access-rg7vh\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.581494 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.581387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: 
\"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.682704 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.682652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.682704 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.682698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg7vh\" (UniqueName: \"kubernetes.io/projected/dc04f3b1-131e-4d1b-90cf-228596c99180-kube-api-access-rg7vh\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.682988 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.682907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.683134 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.683110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.683213 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.683121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.683305 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.683287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.683370 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.683326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04f3b1-131e-4d1b-90cf-228596c99180-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.683370 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.683364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.683477 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.683363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.683749 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.683701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.685691 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.685669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04f3b1-131e-4d1b-90cf-228596c99180-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.691777 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.691754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg7vh\" (UniqueName: \"kubernetes.io/projected/dc04f3b1-131e-4d1b-90cf-228596c99180-kube-api-access-rg7vh\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:48.778630 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:48.778542 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:49.120014 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:49.119984 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78"] Apr 23 19:06:49.120596 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:06:49.120571 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc04f3b1_131e_4d1b_90cf_228596c99180.slice/crio-bfd0c8820ed0058fb30be1212594d0511a2aed2f7969b474d5c7b49c34a20e16 WatchSource:0}: Error finding container bfd0c8820ed0058fb30be1212594d0511a2aed2f7969b474d5c7b49c34a20e16: Status 404 returned error can't find the container with id bfd0c8820ed0058fb30be1212594d0511a2aed2f7969b474d5c7b49c34a20e16 Apr 23 19:06:49.122447 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:49.122422 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 19:06:49.872941 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:49.872911 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerID="5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac" exitCode=0 Apr 23 19:06:49.873263 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:49.872990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" event={"ID":"dc04f3b1-131e-4d1b-90cf-228596c99180","Type":"ContainerDied","Data":"5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac"} Apr 23 19:06:49.873263 ip-10-0-138-68 
kubenswrapper[2576]: I0423 19:06:49.873026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" event={"ID":"dc04f3b1-131e-4d1b-90cf-228596c99180","Type":"ContainerStarted","Data":"bfd0c8820ed0058fb30be1212594d0511a2aed2f7969b474d5c7b49c34a20e16"} Apr 23 19:06:50.878874 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:50.878842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" event={"ID":"dc04f3b1-131e-4d1b-90cf-228596c99180","Type":"ContainerStarted","Data":"ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb"} Apr 23 19:06:50.878874 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:50.878878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" event={"ID":"dc04f3b1-131e-4d1b-90cf-228596c99180","Type":"ContainerStarted","Data":"7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac"} Apr 23 19:06:50.879436 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:50.879001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:50.909911 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:50.909855 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" podStartSLOduration=2.909838427 podStartE2EDuration="2.909838427s" podCreationTimestamp="2026-04-23 19:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:06:50.908302253 +0000 UTC m=+5118.838314900" watchObservedRunningTime="2026-04-23 19:06:50.909838427 +0000 UTC m=+5118.839851069" Apr 23 19:06:58.779321 
ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:58.779285 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:58.779799 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:58.779335 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:58.782257 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:58.782226 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:06:58.914145 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:06:58.914116 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:07:19.918379 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:07:19.918347 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:09:33.233249 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:33.233215 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78"] Apr 23 19:09:33.233672 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:33.233632 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="main" containerID="cri-o://7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac" gracePeriod=30 Apr 23 19:09:33.233751 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:33.233670 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="tokenizer" containerID="cri-o://ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb" gracePeriod=30 Apr 23 19:09:33.527343 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:33.527243 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerID="7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac" exitCode=0 Apr 23 19:09:33.527343 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:33.527284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" event={"ID":"dc04f3b1-131e-4d1b-90cf-228596c99180","Type":"ContainerDied","Data":"7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac"} Apr 23 19:09:34.488512 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.488489 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:09:34.534151 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.534119 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerID="ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb" exitCode=0 Apr 23 19:09:34.534309 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.534197 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" Apr 23 19:09:34.534309 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.534190 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" event={"ID":"dc04f3b1-131e-4d1b-90cf-228596c99180","Type":"ContainerDied","Data":"ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb"} Apr 23 19:09:34.534309 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.534303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78" event={"ID":"dc04f3b1-131e-4d1b-90cf-228596c99180","Type":"ContainerDied","Data":"bfd0c8820ed0058fb30be1212594d0511a2aed2f7969b474d5c7b49c34a20e16"} Apr 23 19:09:34.534444 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.534319 2576 scope.go:117] "RemoveContainer" containerID="ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb" Apr 23 19:09:34.543512 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.543492 2576 scope.go:117] "RemoveContainer" containerID="7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac" Apr 23 19:09:34.547540 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.547517 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04f3b1-131e-4d1b-90cf-228596c99180-tls-certs\") pod \"dc04f3b1-131e-4d1b-90cf-228596c99180\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " Apr 23 19:09:34.547659 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.547554 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-cache\") pod \"dc04f3b1-131e-4d1b-90cf-228596c99180\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " Apr 23 
19:09:34.547659 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.547628 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-uds\") pod \"dc04f3b1-131e-4d1b-90cf-228596c99180\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " Apr 23 19:09:34.547797 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.547756 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg7vh\" (UniqueName: \"kubernetes.io/projected/dc04f3b1-131e-4d1b-90cf-228596c99180-kube-api-access-rg7vh\") pod \"dc04f3b1-131e-4d1b-90cf-228596c99180\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " Apr 23 19:09:34.547853 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.547796 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-kserve-provision-location\") pod \"dc04f3b1-131e-4d1b-90cf-228596c99180\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " Apr 23 19:09:34.547853 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.547833 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-tmp\") pod \"dc04f3b1-131e-4d1b-90cf-228596c99180\" (UID: \"dc04f3b1-131e-4d1b-90cf-228596c99180\") " Apr 23 19:09:34.547955 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.547888 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "dc04f3b1-131e-4d1b-90cf-228596c99180" (UID: "dc04f3b1-131e-4d1b-90cf-228596c99180"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:09:34.548040 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.547966 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "dc04f3b1-131e-4d1b-90cf-228596c99180" (UID: "dc04f3b1-131e-4d1b-90cf-228596c99180"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:09:34.548191 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.548170 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:09:34.548259 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.548197 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:09:34.548259 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.548218 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "dc04f3b1-131e-4d1b-90cf-228596c99180" (UID: "dc04f3b1-131e-4d1b-90cf-228596c99180"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:09:34.548566 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.548546 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dc04f3b1-131e-4d1b-90cf-228596c99180" (UID: "dc04f3b1-131e-4d1b-90cf-228596c99180"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:09:34.549823 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.549793 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc04f3b1-131e-4d1b-90cf-228596c99180-kube-api-access-rg7vh" (OuterVolumeSpecName: "kube-api-access-rg7vh") pod "dc04f3b1-131e-4d1b-90cf-228596c99180" (UID: "dc04f3b1-131e-4d1b-90cf-228596c99180"). InnerVolumeSpecName "kube-api-access-rg7vh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:09:34.550252 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.550234 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc04f3b1-131e-4d1b-90cf-228596c99180-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dc04f3b1-131e-4d1b-90cf-228596c99180" (UID: "dc04f3b1-131e-4d1b-90cf-228596c99180"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:09:34.552106 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.552090 2576 scope.go:117] "RemoveContainer" containerID="5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac" Apr 23 19:09:34.565458 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.565431 2576 scope.go:117] "RemoveContainer" containerID="ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb" Apr 23 19:09:34.565726 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:09:34.565691 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb\": container with ID starting with ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb not found: ID does not exist" containerID="ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb" Apr 23 19:09:34.565804 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.565740 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb"} err="failed to get container status \"ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb\": rpc error: code = NotFound desc = could not find container \"ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb\": container with ID starting with ef6bd58ce5b55fd7a0e6b4a6bcf94caeb9646341b72db3d37e90c548e942a8bb not found: ID does not exist" Apr 23 19:09:34.565804 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.565765 2576 scope.go:117] "RemoveContainer" containerID="7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac" Apr 23 19:09:34.565990 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:09:34.565973 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac\": container with ID starting with 7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac not found: ID does not exist" containerID="7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac" Apr 23 19:09:34.566035 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.565994 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac"} err="failed to get container status \"7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac\": rpc error: code = NotFound desc = could not find container \"7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac\": container with ID starting with 7d1b84b8ca5aebca4b6582720998e636beea89329daed26ab71a03a650ff80ac not found: ID does not exist" Apr 23 19:09:34.566035 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.566007 2576 scope.go:117] "RemoveContainer" 
containerID="5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac" Apr 23 19:09:34.566243 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:09:34.566226 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac\": container with ID starting with 5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac not found: ID does not exist" containerID="5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac" Apr 23 19:09:34.566291 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.566249 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac"} err="failed to get container status \"5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac\": rpc error: code = NotFound desc = could not find container \"5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac\": container with ID starting with 5f26e9405494168be1f95ba68795a86ad1043177b62ab20d009b7e507b7da5ac not found: ID does not exist" Apr 23 19:09:34.648733 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.648694 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:09:34.648894 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.648741 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc04f3b1-131e-4d1b-90cf-228596c99180-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:09:34.648894 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.648751 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rg7vh\" (UniqueName: 
\"kubernetes.io/projected/dc04f3b1-131e-4d1b-90cf-228596c99180-kube-api-access-rg7vh\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:09:34.648894 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.648762 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc04f3b1-131e-4d1b-90cf-228596c99180-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:09:34.856739 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.856687 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78"] Apr 23 19:09:34.864220 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:34.864190 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherzz78"] Apr 23 19:09:36.648122 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:09:36.648091 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" path="/var/lib/kubelet/pods/dc04f3b1-131e-4d1b-90cf-228596c99180/volumes" Apr 23 19:10:35.323454 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323413 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 19:10:35.324074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323802 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="main" Apr 23 19:10:35.324074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323814 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="main" Apr 23 19:10:35.324074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323823 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="tokenizer" Apr 23 19:10:35.324074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323829 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="tokenizer" Apr 23 19:10:35.324074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323846 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="storage-initializer" Apr 23 19:10:35.324074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323853 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="storage-initializer" Apr 23 19:10:35.324074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323913 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="tokenizer" Apr 23 19:10:35.324074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.323922 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc04f3b1-131e-4d1b-90cf-228596c99180" containerName="main" Apr 23 19:10:35.328258 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.328239 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.332683 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.332662 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 23 19:10:35.332810 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.332661 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-kgc5v\"" Apr 23 19:10:35.332810 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.332663 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 19:10:35.340390 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.340366 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 19:10:35.476306 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.476268 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk"] Apr 23 19:10:35.480860 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.480837 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.489629 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.489608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-t5r27\"" Apr 23 19:10:35.498314 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.498291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2eeae007-929e-4f4b-9c79-71d29762308b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.498452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.498323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.498452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.498352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.498452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.498435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rtrl2\" (UniqueName: \"kubernetes.io/projected/2eeae007-929e-4f4b-9c79-71d29762308b-kube-api-access-rtrl2\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.498563 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.498457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.498563 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.498484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.498563 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.498534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.505562 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.505534 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk"] Apr 23 19:10:35.599777 
ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.599661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.599777 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.599704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.599777 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.599756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.600028 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.599785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.600028 ip-10-0-138-68 kubenswrapper[2576]: 
I0423 19:10:35.599879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtrl2\" (UniqueName: \"kubernetes.io/projected/2eeae007-929e-4f4b-9c79-71d29762308b-kube-api-access-rtrl2\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600028 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.599926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.600028 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.599956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600028 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.599993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600293 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600293 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/025528af-ca04-4e6f-b675-6a350285af5f-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.600293 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9j8q\" (UniqueName: \"kubernetes.io/projected/025528af-ca04-4e6f-b675-6a350285af5f-kube-api-access-w9j8q\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.600293 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600293 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2eeae007-929e-4f4b-9c79-71d29762308b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600293 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600293 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600530 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.600530 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.600453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.602045 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.602018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.602372 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.602356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2eeae007-929e-4f4b-9c79-71d29762308b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.611511 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.611488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtrl2\" (UniqueName: \"kubernetes.io/projected/2eeae007-929e-4f4b-9c79-71d29762308b-kube-api-access-rtrl2\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.639065 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.639041 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:10:35.702593 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.701531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.702593 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.702538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.704207 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.702778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/025528af-ca04-4e6f-b675-6a350285af5f-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.704207 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.702818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9j8q\" (UniqueName: \"kubernetes.io/projected/025528af-ca04-4e6f-b675-6a350285af5f-kube-api-access-w9j8q\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.704207 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.702904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.704207 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.702934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.704207 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.702975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.704207 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.703325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.704207 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.703409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.704207 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.703552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.709450 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.709422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/025528af-ca04-4e6f-b675-6a350285af5f-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.714107 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.714081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9j8q\" (UniqueName: \"kubernetes.io/projected/025528af-ca04-4e6f-b675-6a350285af5f-kube-api-access-w9j8q\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" 
Apr 23 19:10:35.779060 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.779035 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 19:10:35.781290 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:10:35.781265 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eeae007_929e_4f4b_9c79_71d29762308b.slice/crio-b2544d8c70428b32b9a2094886ce6ce172b86d6a2414fc6dc54a50327024af2a WatchSource:0}: Error finding container b2544d8c70428b32b9a2094886ce6ce172b86d6a2414fc6dc54a50327024af2a: Status 404 returned error can't find the container with id b2544d8c70428b32b9a2094886ce6ce172b86d6a2414fc6dc54a50327024af2a Apr 23 19:10:35.791349 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.791328 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:35.940129 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:35.940098 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk"] Apr 23 19:10:35.940223 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:10:35.940158 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod025528af_ca04_4e6f_b675_6a350285af5f.slice/crio-e2dd19ee80bb52d2a6ec063365f332993c51c282fa5101a0b9d7f1814d283626 WatchSource:0}: Error finding container e2dd19ee80bb52d2a6ec063365f332993c51c282fa5101a0b9d7f1814d283626: Status 404 returned error can't find the container with id e2dd19ee80bb52d2a6ec063365f332993c51c282fa5101a0b9d7f1814d283626 Apr 23 19:10:36.772583 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:36.772544 2576 generic.go:358] "Generic (PLEG): container finished" podID="025528af-ca04-4e6f-b675-6a350285af5f" 
containerID="41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5" exitCode=0 Apr 23 19:10:36.773037 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:36.772637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" event={"ID":"025528af-ca04-4e6f-b675-6a350285af5f","Type":"ContainerDied","Data":"41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5"} Apr 23 19:10:36.773037 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:36.772679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" event={"ID":"025528af-ca04-4e6f-b675-6a350285af5f","Type":"ContainerStarted","Data":"e2dd19ee80bb52d2a6ec063365f332993c51c282fa5101a0b9d7f1814d283626"} Apr 23 19:10:36.774542 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:36.774517 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"2eeae007-929e-4f4b-9c79-71d29762308b","Type":"ContainerStarted","Data":"c5c4db6a2da99e48ccf19e768233bd72f58d8c6ee39e372703a9737f778050db"} Apr 23 19:10:36.774640 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:36.774552 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"2eeae007-929e-4f4b-9c79-71d29762308b","Type":"ContainerStarted","Data":"b2544d8c70428b32b9a2094886ce6ce172b86d6a2414fc6dc54a50327024af2a"} Apr 23 19:10:37.781302 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:37.781258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" event={"ID":"025528af-ca04-4e6f-b675-6a350285af5f","Type":"ContainerStarted","Data":"922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041"} Apr 23 19:10:37.781302 ip-10-0-138-68 
kubenswrapper[2576]: I0423 19:10:37.781307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" event={"ID":"025528af-ca04-4e6f-b675-6a350285af5f","Type":"ContainerStarted","Data":"13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60"} Apr 23 19:10:37.781756 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:37.781499 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:37.810291 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:37.810235 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" podStartSLOduration=2.810217931 podStartE2EDuration="2.810217931s" podCreationTimestamp="2026-04-23 19:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:10:37.807289472 +0000 UTC m=+5345.737302145" watchObservedRunningTime="2026-04-23 19:10:37.810217931 +0000 UTC m=+5345.740230567" Apr 23 19:10:45.791524 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:45.791479 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:45.792098 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:45.791638 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:45.794469 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:10:45.794441 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:10:45.814025 ip-10-0-138-68 
kubenswrapper[2576]: I0423 19:10:45.814002 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:11:07.822128 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:07.822097 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:11:29.992819 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:29.992782 2576 generic.go:358] "Generic (PLEG): container finished" podID="2eeae007-929e-4f4b-9c79-71d29762308b" containerID="c5c4db6a2da99e48ccf19e768233bd72f58d8c6ee39e372703a9737f778050db" exitCode=0 Apr 23 19:11:29.993199 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:29.992858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"2eeae007-929e-4f4b-9c79-71d29762308b","Type":"ContainerDied","Data":"c5c4db6a2da99e48ccf19e768233bd72f58d8c6ee39e372703a9737f778050db"} Apr 23 19:11:33.056621 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:33.056517 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 19:11:33.061448 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:33.061423 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 19:11:33.072188 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:33.072165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log" Apr 23 19:11:33.078462 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:33.078436 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 19:11:58.138974 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:58.138940 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"2eeae007-929e-4f4b-9c79-71d29762308b","Type":"ContainerStarted","Data":"333403a7e9a3f0656a8c853d46468dc621a4453169fd55ff86f1d168c5813769"} Apr 23 19:11:58.162800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:11:58.162737 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=56.034435456 podStartE2EDuration="1m23.162700593s" podCreationTimestamp="2026-04-23 19:10:35 +0000 UTC" firstStartedPulling="2026-04-23 19:11:29.994104234 +0000 UTC m=+5397.924116852" lastFinishedPulling="2026-04-23 19:11:57.122369362 +0000 UTC m=+5425.052381989" observedRunningTime="2026-04-23 19:11:58.161327183 +0000 UTC m=+5426.091339821" watchObservedRunningTime="2026-04-23 19:11:58.162700593 +0000 UTC m=+5426.092713229" Apr 23 19:12:37.814184 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:37.814143 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk"] Apr 23 19:12:37.814800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:37.814437 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="main" containerID="cri-o://13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60" gracePeriod=30 Apr 23 19:12:37.814800 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:37.814491 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="tokenizer" containerID="cri-o://922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041" gracePeriod=30 Apr 23 19:12:37.821646 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:12:37.821621 2576 logging.go:55] [core] [Channel #2516 SubChannel #2517]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.53:9003", ServerName: "10.133.0.53:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.53:9003: connect: connection refused" Apr 23 19:12:38.314639 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:38.314604 2576 generic.go:358] "Generic (PLEG): container finished" podID="025528af-ca04-4e6f-b675-6a350285af5f" containerID="13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60" exitCode=0 Apr 23 19:12:38.314855 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:38.314690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" event={"ID":"025528af-ca04-4e6f-b675-6a350285af5f","Type":"ContainerDied","Data":"13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60"} Apr 23 19:12:38.821221 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:38.821178 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.53:9003\" within 1s: context deadline exceeded" Apr 23 19:12:39.087334 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.087307 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:12:39.175324 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175293 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-tmp\") pod \"025528af-ca04-4e6f-b675-6a350285af5f\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " Apr 23 19:12:39.175520 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175346 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-cache\") pod \"025528af-ca04-4e6f-b675-6a350285af5f\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " Apr 23 19:12:39.175520 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175367 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9j8q\" (UniqueName: \"kubernetes.io/projected/025528af-ca04-4e6f-b675-6a350285af5f-kube-api-access-w9j8q\") pod \"025528af-ca04-4e6f-b675-6a350285af5f\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " Apr 23 19:12:39.175520 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175419 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/025528af-ca04-4e6f-b675-6a350285af5f-tls-certs\") pod \"025528af-ca04-4e6f-b675-6a350285af5f\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " Apr 23 19:12:39.175520 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175490 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-uds\") pod \"025528af-ca04-4e6f-b675-6a350285af5f\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " Apr 23 19:12:39.175520 
ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175514 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-kserve-provision-location\") pod \"025528af-ca04-4e6f-b675-6a350285af5f\" (UID: \"025528af-ca04-4e6f-b675-6a350285af5f\") " Apr 23 19:12:39.175823 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175640 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "025528af-ca04-4e6f-b675-6a350285af5f" (UID: "025528af-ca04-4e6f-b675-6a350285af5f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:12:39.175823 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175677 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "025528af-ca04-4e6f-b675-6a350285af5f" (UID: "025528af-ca04-4e6f-b675-6a350285af5f"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:12:39.175916 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175842 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "025528af-ca04-4e6f-b675-6a350285af5f" (UID: "025528af-ca04-4e6f-b675-6a350285af5f"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:12:39.175916 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175874 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-tmp\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:12:39.175916 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.175895 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:12:39.176298 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.176275 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "025528af-ca04-4e6f-b675-6a350285af5f" (UID: "025528af-ca04-4e6f-b675-6a350285af5f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:12:39.177629 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.177606 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025528af-ca04-4e6f-b675-6a350285af5f-kube-api-access-w9j8q" (OuterVolumeSpecName: "kube-api-access-w9j8q") pod "025528af-ca04-4e6f-b675-6a350285af5f" (UID: "025528af-ca04-4e6f-b675-6a350285af5f"). InnerVolumeSpecName "kube-api-access-w9j8q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:12:39.177754 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.177658 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025528af-ca04-4e6f-b675-6a350285af5f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "025528af-ca04-4e6f-b675-6a350285af5f" (UID: "025528af-ca04-4e6f-b675-6a350285af5f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:12:39.276287 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.276243 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 19:12:39.276561 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.276519 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="2eeae007-929e-4f4b-9c79-71d29762308b" containerName="main" containerID="cri-o://333403a7e9a3f0656a8c853d46468dc621a4453169fd55ff86f1d168c5813769" gracePeriod=30 Apr 23 19:12:39.276736 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.276693 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-tokenizer-uds\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:12:39.276830 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.276739 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/025528af-ca04-4e6f-b675-6a350285af5f-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:12:39.276830 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.276755 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9j8q\" (UniqueName: 
\"kubernetes.io/projected/025528af-ca04-4e6f-b675-6a350285af5f-kube-api-access-w9j8q\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:12:39.276830 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.276768 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/025528af-ca04-4e6f-b675-6a350285af5f-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:12:39.320284 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.320250 2576 generic.go:358] "Generic (PLEG): container finished" podID="025528af-ca04-4e6f-b675-6a350285af5f" containerID="922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041" exitCode=0 Apr 23 19:12:39.320451 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.320333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" event={"ID":"025528af-ca04-4e6f-b675-6a350285af5f","Type":"ContainerDied","Data":"922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041"} Apr 23 19:12:39.320451 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.320348 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" Apr 23 19:12:39.320451 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.320373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk" event={"ID":"025528af-ca04-4e6f-b675-6a350285af5f","Type":"ContainerDied","Data":"e2dd19ee80bb52d2a6ec063365f332993c51c282fa5101a0b9d7f1814d283626"} Apr 23 19:12:39.320451 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.320391 2576 scope.go:117] "RemoveContainer" containerID="922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041" Apr 23 19:12:39.328999 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.328952 2576 scope.go:117] "RemoveContainer" containerID="13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60" Apr 23 19:12:39.337105 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.337086 2576 scope.go:117] "RemoveContainer" containerID="41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5" Apr 23 19:12:39.345793 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.345769 2576 scope.go:117] "RemoveContainer" containerID="922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041" Apr 23 19:12:39.346166 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:12:39.346130 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041\": container with ID starting with 922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041 not found: ID does not exist" containerID="922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041" Apr 23 19:12:39.346280 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.346172 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041"} err="failed to get container status \"922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041\": rpc error: code = NotFound desc = could not find container \"922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041\": container with ID starting with 922229b9822e04130946afa6693d167b7af85111fa0cfbc56b070a1baca77041 not found: ID does not exist" Apr 23 19:12:39.346280 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.346199 2576 scope.go:117] "RemoveContainer" containerID="13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60" Apr 23 19:12:39.346503 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:12:39.346474 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60\": container with ID starting with 13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60 not found: ID does not exist" containerID="13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60" Apr 23 19:12:39.346577 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.346502 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60"} err="failed to get container status \"13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60\": rpc error: code = NotFound desc = could not find container \"13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60\": container with ID starting with 13e45663b868a5e2fdb54f9cebe139713fd582972570bc3d2a90243977eb4c60 not found: ID does not exist" Apr 23 19:12:39.346577 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.346527 2576 scope.go:117] "RemoveContainer" containerID="41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5" Apr 23 19:12:39.346846 ip-10-0-138-68 
kubenswrapper[2576]: E0423 19:12:39.346824 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5\": container with ID starting with 41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5 not found: ID does not exist" containerID="41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5" Apr 23 19:12:39.347058 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.346853 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5"} err="failed to get container status \"41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5\": rpc error: code = NotFound desc = could not find container \"41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5\": container with ID starting with 41c30ca40caaa3de7726b15d299ddb837d734aa921a1089e3618bac4888e77e5 not found: ID does not exist" Apr 23 19:12:39.350080 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.350055 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk"] Apr 23 19:12:39.357560 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:39.357539 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedlwqk"] Apr 23 19:12:40.648257 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:12:40.648222 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025528af-ca04-4e6f-b675-6a350285af5f" path="/var/lib/kubelet/pods/025528af-ca04-4e6f-b675-6a350285af5f/volumes" Apr 23 19:13:09.442136 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.442108 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_2eeae007-929e-4f4b-9c79-71d29762308b/main/0.log" Apr 23 19:13:09.442562 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.442443 2576 generic.go:358] "Generic (PLEG): container finished" podID="2eeae007-929e-4f4b-9c79-71d29762308b" containerID="333403a7e9a3f0656a8c853d46468dc621a4453169fd55ff86f1d168c5813769" exitCode=137 Apr 23 19:13:09.442562 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.442476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"2eeae007-929e-4f4b-9c79-71d29762308b","Type":"ContainerDied","Data":"333403a7e9a3f0656a8c853d46468dc621a4453169fd55ff86f1d168c5813769"} Apr 23 19:13:09.927553 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.927528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_2eeae007-929e-4f4b-9c79-71d29762308b/main/0.log" Apr 23 19:13:09.927921 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.927905 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:13:09.950970 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.950941 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-tmp-dir\") pod \"2eeae007-929e-4f4b-9c79-71d29762308b\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " Apr 23 19:13:09.951109 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.950977 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-model-cache\") pod \"2eeae007-929e-4f4b-9c79-71d29762308b\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " Apr 23 19:13:09.951109 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951036 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-home\") pod \"2eeae007-929e-4f4b-9c79-71d29762308b\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " Apr 23 19:13:09.951241 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951108 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-kserve-provision-location\") pod \"2eeae007-929e-4f4b-9c79-71d29762308b\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " Apr 23 19:13:09.951241 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951160 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-dshm\") pod \"2eeae007-929e-4f4b-9c79-71d29762308b\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " Apr 23 19:13:09.951241 ip-10-0-138-68 kubenswrapper[2576]: I0423 
19:13:09.951201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2eeae007-929e-4f4b-9c79-71d29762308b-tls-certs\") pod \"2eeae007-929e-4f4b-9c79-71d29762308b\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " Apr 23 19:13:09.951241 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951228 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtrl2\" (UniqueName: \"kubernetes.io/projected/2eeae007-929e-4f4b-9c79-71d29762308b-kube-api-access-rtrl2\") pod \"2eeae007-929e-4f4b-9c79-71d29762308b\" (UID: \"2eeae007-929e-4f4b-9c79-71d29762308b\") " Apr 23 19:13:09.951490 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951350 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "2eeae007-929e-4f4b-9c79-71d29762308b" (UID: "2eeae007-929e-4f4b-9c79-71d29762308b"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:09.951490 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951339 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-model-cache" (OuterVolumeSpecName: "model-cache") pod "2eeae007-929e-4f4b-9c79-71d29762308b" (UID: "2eeae007-929e-4f4b-9c79-71d29762308b"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:09.951648 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951596 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-tmp-dir\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:13:09.951648 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951605 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-home" (OuterVolumeSpecName: "home") pod "2eeae007-929e-4f4b-9c79-71d29762308b" (UID: "2eeae007-929e-4f4b-9c79-71d29762308b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:09.951648 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.951617 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-model-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:13:09.954989 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.954914 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eeae007-929e-4f4b-9c79-71d29762308b-kube-api-access-rtrl2" (OuterVolumeSpecName: "kube-api-access-rtrl2") pod "2eeae007-929e-4f4b-9c79-71d29762308b" (UID: "2eeae007-929e-4f4b-9c79-71d29762308b"). InnerVolumeSpecName "kube-api-access-rtrl2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 19:13:09.955533 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.955507 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-dshm" (OuterVolumeSpecName: "dshm") pod "2eeae007-929e-4f4b-9c79-71d29762308b" (UID: "2eeae007-929e-4f4b-9c79-71d29762308b"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:09.956364 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:09.956338 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eeae007-929e-4f4b-9c79-71d29762308b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2eeae007-929e-4f4b-9c79-71d29762308b" (UID: "2eeae007-929e-4f4b-9c79-71d29762308b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 19:13:10.010796 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.010753 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2eeae007-929e-4f4b-9c79-71d29762308b" (UID: "2eeae007-929e-4f4b-9c79-71d29762308b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 19:13:10.052650 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.052559 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:13:10.052650 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.052589 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-dshm\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:13:10.052650 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.052603 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2eeae007-929e-4f4b-9c79-71d29762308b-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:13:10.052650 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.052617 2576 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtrl2\" (UniqueName: \"kubernetes.io/projected/2eeae007-929e-4f4b-9c79-71d29762308b-kube-api-access-rtrl2\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:13:10.052650 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.052629 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2eeae007-929e-4f4b-9c79-71d29762308b-home\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\"" Apr 23 19:13:10.449294 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.449261 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_2eeae007-929e-4f4b-9c79-71d29762308b/main/0.log" Apr 23 19:13:10.449740 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.449635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"2eeae007-929e-4f4b-9c79-71d29762308b","Type":"ContainerDied","Data":"b2544d8c70428b32b9a2094886ce6ce172b86d6a2414fc6dc54a50327024af2a"} Apr 23 19:13:10.449740 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.449670 2576 scope.go:117] "RemoveContainer" containerID="333403a7e9a3f0656a8c853d46468dc621a4453169fd55ff86f1d168c5813769" Apr 23 19:13:10.449740 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.449686 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 19:13:10.459252 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.459230 2576 scope.go:117] "RemoveContainer" containerID="c5c4db6a2da99e48ccf19e768233bd72f58d8c6ee39e372703a9737f778050db" Apr 23 19:13:10.481774 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.481739 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 19:13:10.493034 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.493009 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 19:13:10.648979 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:10.648945 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eeae007-929e-4f4b-9c79-71d29762308b" path="/var/lib/kubelet/pods/2eeae007-929e-4f4b-9c79-71d29762308b/volumes" Apr 23 19:13:24.527850 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.527817 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn"] Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528235 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="storage-initializer" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528252 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="storage-initializer" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528267 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="main" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528275 2576 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="main" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528287 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2eeae007-929e-4f4b-9c79-71d29762308b" containerName="main" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528295 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeae007-929e-4f4b-9c79-71d29762308b" containerName="main" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528315 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2eeae007-929e-4f4b-9c79-71d29762308b" containerName="storage-initializer" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528325 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeae007-929e-4f4b-9c79-71d29762308b" containerName="storage-initializer" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528349 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="tokenizer" Apr 23 19:13:24.528369 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528357 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="tokenizer" Apr 23 19:13:24.528984 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528457 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="main" Apr 23 19:13:24.528984 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528474 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="025528af-ca04-4e6f-b675-6a350285af5f" containerName="tokenizer" Apr 23 19:13:24.528984 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.528485 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="2eeae007-929e-4f4b-9c79-71d29762308b" containerName="main" Apr 23 19:13:24.534783 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.534755 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.538547 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.538521 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-2gqdr\"" Apr 23 19:13:24.548155 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.548129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn"] Apr 23 19:13:24.583209 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/141c5630-1549-492f-a0e6-8f8db0cf749e-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.583396 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.583396 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xb2\" (UniqueName: 
\"kubernetes.io/projected/141c5630-1549-492f-a0e6-8f8db0cf749e-kube-api-access-s8xb2\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.583396 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.583396 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.583396 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.583396 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.583763 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.583763 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.583487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.683945 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.683906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.683945 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.683950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-data\") pod 
\"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684224 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684224 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684224 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/141c5630-1549-492f-a0e6-8f8db0cf749e-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684224 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684224 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xb2\" (UniqueName: \"kubernetes.io/projected/141c5630-1549-492f-a0e6-8f8db0cf749e-kube-api-access-s8xb2\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684224 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684521 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684521 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684637 ip-10-0-138-68 kubenswrapper[2576]: 
I0423 19:13:24.684609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684739 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.684811 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.684787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.685296 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.685271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/141c5630-1549-492f-a0e6-8f8db0cf749e-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.686733 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.686695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.686827 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.686761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.693586 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.693560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/141c5630-1549-492f-a0e6-8f8db0cf749e-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.694283 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.694261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xb2\" (UniqueName: \"kubernetes.io/projected/141c5630-1549-492f-a0e6-8f8db0cf749e-kube-api-access-s8xb2\") pod \"router-gateway-2-openshift-default-6866b85949-kqbbn\" (UID: \"141c5630-1549-492f-a0e6-8f8db0cf749e\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:24.849318 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:24.848820 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:25.001760 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.001729 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn"] Apr 23 19:13:25.003809 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:13:25.003777 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141c5630_1549_492f_a0e6_8f8db0cf749e.slice/crio-0cf7a6615c80160c34f5016004d80a57ab3944ecb87fa76b6b8613b5fb0ac66c WatchSource:0}: Error finding container 0cf7a6615c80160c34f5016004d80a57ab3944ecb87fa76b6b8613b5fb0ac66c: Status 404 returned error can't find the container with id 0cf7a6615c80160c34f5016004d80a57ab3944ecb87fa76b6b8613b5fb0ac66c Apr 23 19:13:25.006080 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.006061 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 19:13:25.006414 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.006387 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 19:13:25.006522 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.006460 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 19:13:25.006522 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.006501 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 19:13:25.510013 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.509966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" event={"ID":"141c5630-1549-492f-a0e6-8f8db0cf749e","Type":"ContainerStarted","Data":"9c55bd18b61f04047066ee122ea2da2a93d33476ae42dd7e5190c0e0f8711937"} Apr 23 19:13:25.510013 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.510019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" event={"ID":"141c5630-1549-492f-a0e6-8f8db0cf749e","Type":"ContainerStarted","Data":"0cf7a6615c80160c34f5016004d80a57ab3944ecb87fa76b6b8613b5fb0ac66c"} Apr 23 19:13:25.535500 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.535449 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" podStartSLOduration=1.535434597 podStartE2EDuration="1.535434597s" podCreationTimestamp="2026-04-23 19:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:13:25.533621697 +0000 UTC m=+5513.463634334" watchObservedRunningTime="2026-04-23 19:13:25.535434597 +0000 UTC m=+5513.465447234" Apr 23 19:13:25.850062 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:25.850019 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:26.851376 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:26.851326 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" podUID="141c5630-1549-492f-a0e6-8f8db0cf749e" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.54:15021/healthz/ready\": dial tcp 10.133.0.54:15021: connect: connection refused" Apr 23 19:13:27.850041 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:27.849990 2576 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" podUID="141c5630-1549-492f-a0e6-8f8db0cf749e" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.54:15021/healthz/ready\": dial tcp 10.133.0.54:15021: connect: connection refused" Apr 23 19:13:28.853907 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:28.853873 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:28.854288 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:28.854142 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:28.854821 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:28.854804 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-kqbbn" Apr 23 19:13:37.863415 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.863371 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"] Apr 23 19:13:37.866789 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.866770 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" Apr 23 19:13:37.872289 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.872262 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-78dl4\"" Apr 23 19:13:37.872558 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.872540 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 23 19:13:37.872709 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.872685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-wgh6t\"" Apr 23 19:13:37.884790 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.884763 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"] Apr 23 19:13:37.929439 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.929401 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"] Apr 23 19:13:37.932195 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.932176 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" Apr 23 19:13:37.953915 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:37.953881 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"] Apr 23 19:13:38.003537 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" Apr 23 19:13:38.003737 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" Apr 23 19:13:38.003737 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" Apr 23 19:13:38.003737 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d337d849-33a6-4aee-8ee0-47e717c5abc9-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" Apr 23 19:13:38.003737 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r4fg\" (UniqueName: \"kubernetes.io/projected/d337d849-33a6-4aee-8ee0-47e717c5abc9-kube-api-access-9r4fg\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" Apr 23 19:13:38.003737 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" Apr 23 19:13:38.003737 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-model-cache\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" Apr 23 19:13:38.003979 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d7877468-a2da-4b90-81a0-28f69dbde278-tls-certs\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.003979 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tqf\" (UniqueName: \"kubernetes.io/projected/d7877468-a2da-4b90-81a0-28f69dbde278-kube-api-access-k7tqf\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.003979 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-tmp-dir\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.003979 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-home\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.003979 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-dshm\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.003979 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.003979 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.003934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-home\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.104741 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.104685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-model-cache\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.105431 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.105408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7877468-a2da-4b90-81a0-28f69dbde278-tls-certs\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.105595 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.105580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tqf\" (UniqueName: \"kubernetes.io/projected/d7877468-a2da-4b90-81a0-28f69dbde278-kube-api-access-k7tqf\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.105709 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.105692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-tmp-dir\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.105882 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.105864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-home\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.105971 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.105896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-dshm\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.105971 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.105920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.105971 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.105354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-model-cache\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.106138 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.105989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-home\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106138 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106138 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-tmp-dir\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.106307 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106307 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.106307 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-home\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.106470 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106470 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d337d849-33a6-4aee-8ee0-47e717c5abc9-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106470 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r4fg\" (UniqueName: \"kubernetes.io/projected/d337d849-33a6-4aee-8ee0-47e717c5abc9-kube-api-access-9r4fg\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106470 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106679 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106679 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106497 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106679 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-home\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.106876 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.106828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.108506 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.108478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-dshm\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.108863 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.108839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7877468-a2da-4b90-81a0-28f69dbde278-tls-certs\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.108963 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.108932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.109108 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.109092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d337d849-33a6-4aee-8ee0-47e717c5abc9-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.120001 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.119947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r4fg\" (UniqueName: \"kubernetes.io/projected/d337d849-33a6-4aee-8ee0-47e717c5abc9-kube-api-access-9r4fg\") pod \"router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.120360 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.120341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tqf\" (UniqueName: \"kubernetes.io/projected/d7877468-a2da-4b90-81a0-28f69dbde278-kube-api-access-k7tqf\") pod \"router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.176556 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.176517 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:38.243298 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.243267 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:38.320935 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.320906 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"]
Apr 23 19:13:38.321918 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:13:38.321892 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7877468_a2da_4b90_81a0_28f69dbde278.slice/crio-41e543a34770d12b5902444c6f62f7e6a63bde71a515ae3e1d7b9582fc11391d WatchSource:0}: Error finding container 41e543a34770d12b5902444c6f62f7e6a63bde71a515ae3e1d7b9582fc11391d: Status 404 returned error can't find the container with id 41e543a34770d12b5902444c6f62f7e6a63bde71a515ae3e1d7b9582fc11391d
Apr 23 19:13:38.390958 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.390869 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"]
Apr 23 19:13:38.395426 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:13:38.395394 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd337d849_33a6_4aee_8ee0_47e717c5abc9.slice/crio-ee6144ac45398327ffeadbf2461d2d4134fa36d4448066d8b57357592d902e2e WatchSource:0}: Error finding container ee6144ac45398327ffeadbf2461d2d4134fa36d4448066d8b57357592d902e2e: Status 404 returned error can't find the container with id ee6144ac45398327ffeadbf2461d2d4134fa36d4448066d8b57357592d902e2e
Apr 23 19:13:38.564675 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.564615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" event={"ID":"d337d849-33a6-4aee-8ee0-47e717c5abc9","Type":"ContainerStarted","Data":"3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c"}
Apr 23 19:13:38.564675 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.564662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" event={"ID":"d337d849-33a6-4aee-8ee0-47e717c5abc9","Type":"ContainerStarted","Data":"ee6144ac45398327ffeadbf2461d2d4134fa36d4448066d8b57357592d902e2e"}
Apr 23 19:13:38.565911 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:38.565881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" event={"ID":"d7877468-a2da-4b90-81a0-28f69dbde278","Type":"ContainerStarted","Data":"41e543a34770d12b5902444c6f62f7e6a63bde71a515ae3e1d7b9582fc11391d"}
Apr 23 19:13:39.572513 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:39.572469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" event={"ID":"d7877468-a2da-4b90-81a0-28f69dbde278","Type":"ContainerStarted","Data":"390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7"}
Apr 23 19:13:39.573001 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:39.572529 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:40.580401 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:40.580361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" event={"ID":"d7877468-a2da-4b90-81a0-28f69dbde278","Type":"ContainerStarted","Data":"870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc"}
Apr 23 19:13:43.601109 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:43.601071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" event={"ID":"d337d849-33a6-4aee-8ee0-47e717c5abc9","Type":"ContainerDied","Data":"3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c"}
Apr 23 19:13:43.601536 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:43.601186 2576 generic.go:358] "Generic (PLEG): container finished" podID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerID="3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c" exitCode=0
Apr 23 19:13:44.607450 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:44.607408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" event={"ID":"d337d849-33a6-4aee-8ee0-47e717c5abc9","Type":"ContainerStarted","Data":"5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df"}
Apr 23 19:13:44.609074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:44.609044 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7877468-a2da-4b90-81a0-28f69dbde278" containerID="870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc" exitCode=0
Apr 23 19:13:44.609074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:44.609064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" event={"ID":"d7877468-a2da-4b90-81a0-28f69dbde278","Type":"ContainerDied","Data":"870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc"}
Apr 23 19:13:44.635320 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:44.635258 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podStartSLOduration=7.63524181 podStartE2EDuration="7.63524181s" podCreationTimestamp="2026-04-23 19:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:13:44.632763809 +0000 UTC m=+5532.562776446" watchObservedRunningTime="2026-04-23 19:13:44.63524181 +0000 UTC m=+5532.565254444"
Apr 23 19:13:45.616179 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:45.616140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" event={"ID":"d7877468-a2da-4b90-81a0-28f69dbde278","Type":"ContainerStarted","Data":"ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc"}
Apr 23 19:13:45.644703 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:45.644636 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podStartSLOduration=7.716352851 podStartE2EDuration="8.64461607s" podCreationTimestamp="2026-04-23 19:13:37 +0000 UTC" firstStartedPulling="2026-04-23 19:13:38.323811136 +0000 UTC m=+5526.253823750" lastFinishedPulling="2026-04-23 19:13:39.252074342 +0000 UTC m=+5527.182086969" observedRunningTime="2026-04-23 19:13:45.644211633 +0000 UTC m=+5533.574224321" watchObservedRunningTime="2026-04-23 19:13:45.64461607 +0000 UTC m=+5533.574628707"
Apr 23 19:13:48.177337 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:48.177283 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:48.177337 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:48.177333 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:48.178782 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:48.178751 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:13:48.244030 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:48.243984 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:48.244030 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:48.244028 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:13:48.245561 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:48.245528 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:13:58.177031 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:58.176974 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:13:58.196437 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:58.196406 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:13:58.244059 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:13:58.244007 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:14:08.177319 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:08.177261 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:14:08.244038 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:08.243994 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:14:18.177803 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:18.177743 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:14:18.243713 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:18.243672 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:14:28.177830 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:28.177704 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:14:28.244591 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:28.244552 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:14:38.177601 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:38.177551 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:14:38.244531 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:38.244488 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:14:48.177589 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:48.177539 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:14:48.244555 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:48.244505 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:14:58.177865 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:58.177799 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:14:58.244018 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:14:58.243976 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:15:08.177438 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:08.177390 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:15:08.244737 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:08.244671 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:15:18.177329 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:18.177272 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:15:18.243847 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:18.243800 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:15:28.176950 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:28.176885 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:15:28.244105 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:28.244056 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:15:38.177474 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:38.177423 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:15:38.243859 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:38.243819 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:15:48.177931 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:48.177882 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:15:48.243871 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:48.243828 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:15:58.177678 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:58.177562 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" probeResult="failure" output="Get \"https://10.133.0.55:8001/health\": dial tcp 10.133.0.55:8001: connect: connection refused"
Apr 23 19:15:58.243883 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:15:58.243842 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" probeResult="failure" output="Get \"https://10.133.0.56:8000/health\": dial tcp 10.133.0.56:8000: connect: connection refused"
Apr 23 19:16:08.187060 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:08.187027 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:16:08.205934 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:08.205905 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:16:08.254481 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:08.254451 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:16:08.262789 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:08.262762 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:16:20.209773 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:20.209732 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"]
Apr 23 19:16:20.210267 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:20.210118 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main" containerID="cri-o://5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df" gracePeriod=30
Apr 23 19:16:20.212536 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:20.212499 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"]
Apr 23 19:16:20.212902 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:20.212842 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main" containerID="cri-o://ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc" gracePeriod=30
Apr 23 19:16:33.090383 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:33.090354 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log"
Apr 23 19:16:33.094657 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:33.094634 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log"
Apr 23 19:16:33.117388 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:33.117358 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log"
Apr 23 19:16:33.120857 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:33.120835 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log"
Apr 23 19:16:35.009261 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:35.009230 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log"
Apr 23 19:16:35.042790 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:35.042760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log"
Apr 23 19:16:35.065757 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:35.065707 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log"
Apr 23 19:16:35.075210 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:35.075179 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log"
Apr 23 19:16:35.087559 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:35.087532 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log"
Apr 23 19:16:35.109511 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:35.109486 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log"
Apr 23 19:16:35.120464 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:35.120439 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log"
Apr 23 19:16:36.097906 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:36.097879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log"
Apr 23 19:16:36.115610 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:36.115577 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log"
Apr 23 19:16:36.137474 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:36.137447 2576 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:36.145457 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:36.145434 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:36.159016 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:36.158991 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:36.180826 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:36.180802 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:36.191904 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:36.191882 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:37.186083 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:37.186055 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:37.203765 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:37.203738 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:37.225579 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:37.225546 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:37.235865 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:37.235841 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:37.248174 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:37.248147 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:37.267468 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:37.267444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:37.278198 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:37.278178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:38.260639 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:38.260608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:38.278543 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:38.278515 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:38.305033 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:38.305005 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:38.355190 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:38.355154 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:38.386945 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:38.386917 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:38.437168 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:38.437137 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:38.461744 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:38.461689 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:39.468690 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:39.468660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:39.486924 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:39.486895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:39.508857 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:39.508834 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:39.518337 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:39.518318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:39.531915 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:39.531893 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:39.550413 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:39.550390 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:39.560487 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:39.560468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:40.542654 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:40.542627 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:40.559900 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:40.559871 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:40.583671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:40.583647 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:40.591520 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:40.591494 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:40.602925 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:40.602903 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:40.626449 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:40.626425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:40.636030 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:40.636012 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:41.624020 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:41.623985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:41.655817 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:41.655788 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:41.678583 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:41.678547 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:41.686869 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:41.686845 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:41.699969 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:41.699948 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:41.719525 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:41.719502 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:41.733560 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:41.733542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:42.719680 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:42.719654 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:42.738018 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:42.737993 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:42.761826 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:42.761800 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:42.769530 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:42.769507 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:42.780802 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:42.780783 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:42.799057 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:42.799033 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:42.811446 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:42.811426 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:43.844707 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:43.844677 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:43.863205 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:43.863174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:43.884948 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:43.884920 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:43.893573 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:43.893546 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:43.905814 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:43.905792 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:43.929007 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:43.928978 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:43.938815 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:43.938791 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:44.905557 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:44.905527 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:44.923008 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:44.922983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:44.947663 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:44.947635 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:44.955811 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:44.955789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:44.967124 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:44.967101 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:44.987109 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:44.987089 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:44.999830 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:44.999807 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:45.971402 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:45.971371 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:45.987647 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:45.987620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:46.028318 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:46.028290 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:46.054388 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:46.054356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:46.090986 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:46.090901 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:46.111017 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:46.110976 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:46.120958 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:46.120931 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:47.103625 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:47.103593 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:47.127199 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:47.127176 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:47.149553 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:47.149523 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:47.160019 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:47.159995 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:47.172912 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:47.172884 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:47.191796 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:47.191766 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:47.208379 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:47.208357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:48.183318 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:48.183291 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:48.199466 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:48.199434 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:48.225375 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:48.225346 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:48.233907 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:48.233868 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:48.245222 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:48.245201 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:48.263928 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:48.263897 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:48.272798 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:48.272779 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:49.233889 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:49.233859 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-mqqmd_6e81f2a0-2be6-48bc-8d7e-e039827fbfd7/istio-proxy/0.log" Apr 23 19:16:49.253094 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:49.253067 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-kqbbn_141c5630-1549-492f-a0e6-8f8db0cf749e/istio-proxy/0.log" Apr 23 19:16:49.275122 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:49.275098 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log" Apr 23 19:16:49.283587 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:49.283562 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/llm-d-routing-sidecar/0.log" Apr 23 19:16:49.295841 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:49.295822 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/storage-initializer/0.log" Apr 23 19:16:49.314463 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:49.314435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/main/0.log" Apr 23 19:16:49.323911 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:49.323895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk_d337d849-33a6-4aee-8ee0-47e717c5abc9/storage-initializer/0.log" Apr 23 19:16:50.213127 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.213085 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="llm-d-routing-sidecar" containerID="cri-o://390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7" gracePeriod=2 Apr 23 19:16:50.321411 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.321371 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fjvbp_def6ef66-30a7-4f7f-a2d2-1f9921020679/discovery/0.log" Apr 23 19:16:50.334758 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.334712 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-j64f2_f2afacd4-04d7-4d25-aa31-0e507e733b70/istio-proxy/0.log"
Apr 23 19:16:50.420004 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.419971 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7877468-a2da-4b90-81a0-28f69dbde278" containerID="390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7" exitCode=0
Apr 23 19:16:50.420176 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.420039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" event={"ID":"d7877468-a2da-4b90-81a0-28f69dbde278","Type":"ContainerDied","Data":"390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7"}
Apr 23 19:16:50.617965 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.617943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log"
Apr 23 19:16:50.618618 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.618597 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:16:50.621507 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.621490 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:16:50.762076 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.761985 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-kserve-provision-location\") pod \"d337d849-33a6-4aee-8ee0-47e717c5abc9\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") "
Apr 23 19:16:50.762076 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762021 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-model-cache\") pod \"d7877468-a2da-4b90-81a0-28f69dbde278\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") "
Apr 23 19:16:50.762076 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762039 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7877468-a2da-4b90-81a0-28f69dbde278-tls-certs\") pod \"d7877468-a2da-4b90-81a0-28f69dbde278\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") "
Apr 23 19:16:50.762076 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762075 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-tmp-dir\") pod \"d7877468-a2da-4b90-81a0-28f69dbde278\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") "
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762127 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r4fg\" (UniqueName: \"kubernetes.io/projected/d337d849-33a6-4aee-8ee0-47e717c5abc9-kube-api-access-9r4fg\") pod \"d337d849-33a6-4aee-8ee0-47e717c5abc9\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") "
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762159 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-kserve-provision-location\") pod \"d7877468-a2da-4b90-81a0-28f69dbde278\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") "
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762185 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-dshm\") pod \"d7877468-a2da-4b90-81a0-28f69dbde278\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") "
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762241 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-model-cache\") pod \"d337d849-33a6-4aee-8ee0-47e717c5abc9\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") "
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762268 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-tmp-dir\") pod \"d337d849-33a6-4aee-8ee0-47e717c5abc9\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") "
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762311 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-dshm\") pod \"d337d849-33a6-4aee-8ee0-47e717c5abc9\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") "
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762330 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-model-cache" (OuterVolumeSpecName: "model-cache") pod "d7877468-a2da-4b90-81a0-28f69dbde278" (UID: "d7877468-a2da-4b90-81a0-28f69dbde278"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762334 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-home\") pod \"d337d849-33a6-4aee-8ee0-47e717c5abc9\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") "
Apr 23 19:16:50.762452 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762433 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7tqf\" (UniqueName: \"kubernetes.io/projected/d7877468-a2da-4b90-81a0-28f69dbde278-kube-api-access-k7tqf\") pod \"d7877468-a2da-4b90-81a0-28f69dbde278\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") "
Apr 23 19:16:50.762955 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762507 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d337d849-33a6-4aee-8ee0-47e717c5abc9-tls-certs\") pod \"d337d849-33a6-4aee-8ee0-47e717c5abc9\" (UID: \"d337d849-33a6-4aee-8ee0-47e717c5abc9\") "
Apr 23 19:16:50.762955 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762536 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-home\") pod \"d7877468-a2da-4b90-81a0-28f69dbde278\" (UID: \"d7877468-a2da-4b90-81a0-28f69dbde278\") "
Apr 23 19:16:50.762955 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762562 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-model-cache" (OuterVolumeSpecName: "model-cache") pod "d337d849-33a6-4aee-8ee0-47e717c5abc9" (UID: "d337d849-33a6-4aee-8ee0-47e717c5abc9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.762955 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762868 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-model-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.762955 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.762886 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-model-cache\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.763215 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.763186 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-home" (OuterVolumeSpecName: "home") pod "d337d849-33a6-4aee-8ee0-47e717c5abc9" (UID: "d337d849-33a6-4aee-8ee0-47e717c5abc9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.764832 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.764802 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7877468-a2da-4b90-81a0-28f69dbde278-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d7877468-a2da-4b90-81a0-28f69dbde278" (UID: "d7877468-a2da-4b90-81a0-28f69dbde278"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 19:16:50.764970 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.764836 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d337d849-33a6-4aee-8ee0-47e717c5abc9-kube-api-access-9r4fg" (OuterVolumeSpecName: "kube-api-access-9r4fg") pod "d337d849-33a6-4aee-8ee0-47e717c5abc9" (UID: "d337d849-33a6-4aee-8ee0-47e717c5abc9"). InnerVolumeSpecName "kube-api-access-9r4fg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 19:16:50.765796 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.765739 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-home" (OuterVolumeSpecName: "home") pod "d7877468-a2da-4b90-81a0-28f69dbde278" (UID: "d7877468-a2da-4b90-81a0-28f69dbde278"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.765961 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.765806 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7877468-a2da-4b90-81a0-28f69dbde278-kube-api-access-k7tqf" (OuterVolumeSpecName: "kube-api-access-k7tqf") pod "d7877468-a2da-4b90-81a0-28f69dbde278" (UID: "d7877468-a2da-4b90-81a0-28f69dbde278"). InnerVolumeSpecName "kube-api-access-k7tqf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 19:16:50.766115 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.766095 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-dshm" (OuterVolumeSpecName: "dshm") pod "d7877468-a2da-4b90-81a0-28f69dbde278" (UID: "d7877468-a2da-4b90-81a0-28f69dbde278"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.767430 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.767407 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d337d849-33a6-4aee-8ee0-47e717c5abc9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d337d849-33a6-4aee-8ee0-47e717c5abc9" (UID: "d337d849-33a6-4aee-8ee0-47e717c5abc9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 19:16:50.767739 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.767695 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-dshm" (OuterVolumeSpecName: "dshm") pod "d337d849-33a6-4aee-8ee0-47e717c5abc9" (UID: "d337d849-33a6-4aee-8ee0-47e717c5abc9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.777214 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.777192 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "d7877468-a2da-4b90-81a0-28f69dbde278" (UID: "d7877468-a2da-4b90-81a0-28f69dbde278"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.783180 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.783152 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "d337d849-33a6-4aee-8ee0-47e717c5abc9" (UID: "d337d849-33a6-4aee-8ee0-47e717c5abc9"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.798503 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.798465 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d7877468-a2da-4b90-81a0-28f69dbde278" (UID: "d7877468-a2da-4b90-81a0-28f69dbde278"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.798799 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.798778 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d337d849-33a6-4aee-8ee0-47e717c5abc9" (UID: "d337d849-33a6-4aee-8ee0-47e717c5abc9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 19:16:50.863618 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863582 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k7tqf\" (UniqueName: \"kubernetes.io/projected/d7877468-a2da-4b90-81a0-28f69dbde278-kube-api-access-k7tqf\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863618 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863616 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d337d849-33a6-4aee-8ee0-47e717c5abc9-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863618 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863626 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-home\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863635 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863644 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7877468-a2da-4b90-81a0-28f69dbde278-tls-certs\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863655 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-tmp-dir\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863665 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9r4fg\" (UniqueName: \"kubernetes.io/projected/d337d849-33a6-4aee-8ee0-47e717c5abc9-kube-api-access-9r4fg\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863674 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-kserve-provision-location\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863682 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7877468-a2da-4b90-81a0-28f69dbde278-dshm\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863691 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-tmp-dir\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863699 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-dshm\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:50.863905 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:50.863706 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d337d849-33a6-4aee-8ee0-47e717c5abc9-home\") on node \"ip-10-0-138-68.ec2.internal\" DevicePath \"\""
Apr 23 19:16:51.214412 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.214368 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fjvbp_def6ef66-30a7-4f7f-a2d2-1f9921020679/discovery/0.log"
Apr 23 19:16:51.228208 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.227879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-j64f2_f2afacd4-04d7-4d25-aa31-0e507e733b70/istio-proxy/0.log"
Apr 23 19:16:51.425678 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.425644 2576 generic.go:358] "Generic (PLEG): container finished" podID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerID="5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df" exitCode=137
Apr 23 19:16:51.426155 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.425741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" event={"ID":"d337d849-33a6-4aee-8ee0-47e717c5abc9","Type":"ContainerDied","Data":"5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df"}
Apr 23 19:16:51.426155 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.425758 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"
Apr 23 19:16:51.426155 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.425787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk" event={"ID":"d337d849-33a6-4aee-8ee0-47e717c5abc9","Type":"ContainerDied","Data":"ee6144ac45398327ffeadbf2461d2d4134fa36d4448066d8b57357592d902e2e"}
Apr 23 19:16:51.426155 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.425808 2576 scope.go:117] "RemoveContainer" containerID="5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df"
Apr 23 19:16:51.427237 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.427219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm_d7877468-a2da-4b90-81a0-28f69dbde278/main/0.log"
Apr 23 19:16:51.427875 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.427852 2576 generic.go:358] "Generic (PLEG): container finished" podID="d7877468-a2da-4b90-81a0-28f69dbde278" containerID="ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc" exitCode=137
Apr 23 19:16:51.427988 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.427935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" event={"ID":"d7877468-a2da-4b90-81a0-28f69dbde278","Type":"ContainerDied","Data":"ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc"}
Apr 23 19:16:51.427988 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.427956 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"
Apr 23 19:16:51.427988 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.427981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm" event={"ID":"d7877468-a2da-4b90-81a0-28f69dbde278","Type":"ContainerDied","Data":"41e543a34770d12b5902444c6f62f7e6a63bde71a515ae3e1d7b9582fc11391d"}
Apr 23 19:16:51.435894 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.435858 2576 scope.go:117] "RemoveContainer" containerID="3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c"
Apr 23 19:16:51.450489 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.450470 2576 scope.go:117] "RemoveContainer" containerID="5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df"
Apr 23 19:16:51.450780 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:16:51.450759 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df\": container with ID starting with 5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df not found: ID does not exist" containerID="5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df"
Apr 23 19:16:51.450848 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.450791 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df"} err="failed to get container status \"5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df\": rpc error: code = NotFound desc = could not find container \"5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df\": container with ID starting with 5398d69de6358c32c9a0e186cac412944a4ea2d5dce123e9e65f35c8d86de5df not found: ID does not exist"
Apr 23 19:16:51.450848 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.450808 2576 scope.go:117] "RemoveContainer" containerID="3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c"
Apr 23 19:16:51.451062 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:16:51.451044 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c\": container with ID starting with 3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c not found: ID does not exist" containerID="3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c"
Apr 23 19:16:51.451124 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.451074 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c"} err="failed to get container status \"3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c\": rpc error: code = NotFound desc = could not find container \"3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c\": container with ID starting with 3dc8bb78b1b3398defb94e62e000343da991f19559dc798c43b60d9b595a2b1c not found: ID does not exist"
Apr 23 19:16:51.451124 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.451097 2576 scope.go:117] "RemoveContainer" containerID="ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc"
Apr 23 19:16:51.455914 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.455891 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"]
Apr 23 19:16:51.459283 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.459264 2576 scope.go:117] "RemoveContainer" containerID="870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc"
Apr 23 19:16:51.463690 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.463664 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-7d9f7645f-glgwk"]
Apr 23 19:16:51.470294 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.470275 2576 scope.go:117] "RemoveContainer" containerID="390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7"
Apr 23 19:16:51.478257 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.478238 2576 scope.go:117] "RemoveContainer" containerID="ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc"
Apr 23 19:16:51.478575 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:16:51.478542 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc\": container with ID starting with ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc not found: ID does not exist" containerID="ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc"
Apr 23 19:16:51.478712 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.478579 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc"} err="failed to get container status \"ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc\": rpc error: code = NotFound desc = could not find container \"ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc\": container with ID starting with ce9d03b6e45b028777c013a0b5d7cf74d94296af53a13bf84fb2845ce62245dc not found: ID does not exist"
Apr 23 19:16:51.478712 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.478600 2576 scope.go:117] "RemoveContainer" containerID="870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc"
Apr 23 19:16:51.478934 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:16:51.478914 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc\": container with ID starting with 870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc not found: ID does not exist" containerID="870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc"
Apr 23 19:16:51.478993 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.478942 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc"} err="failed to get container status \"870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc\": rpc error: code = NotFound desc = could not find container \"870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc\": container with ID starting with 870e324e46547b84f036ec5b3103dbc68c42433c0ee391348695e8babb39d2fc not found: ID does not exist"
Apr 23 19:16:51.478993 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.478963 2576 scope.go:117] "RemoveContainer" containerID="390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7"
Apr 23 19:16:51.479399 ip-10-0-138-68 kubenswrapper[2576]: E0423 19:16:51.479374 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7\": container with ID starting with 390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7 not found: ID does not exist" containerID="390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7"
Apr 23 19:16:51.479492 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.479405 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7"} err="failed to get container status \"390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7\": rpc error: code = NotFound desc = could not find container \"390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7\": container with ID starting with 390651d8b1b76119f64dc8c60fda79c28a171d1884931c320d3d259318d880b7 not found: ID does not exist"
Apr 23 19:16:51.482536 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.482514 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"]
Apr 23 19:16:51.488229 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:51.488195 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-d9f6bb5b8-pckhm"]
Apr 23 19:16:52.089697 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:52.089667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-hnn8l_7e2d9c03-3a31-4a7d-89b2-91f44d29bc91/manager/0.log"
Apr 23 19:16:52.239515 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:52.239488 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-l5476_3ea44f52-5a26-411a-83ea-6feb5e67c9fb/manager/0.log"
Apr 23 19:16:52.649074 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:52.649042 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" path="/var/lib/kubelet/pods/d337d849-33a6-4aee-8ee0-47e717c5abc9/volumes"
Apr 23 19:16:52.649492 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:52.649479 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" path="/var/lib/kubelet/pods/d7877468-a2da-4b90-81a0-28f69dbde278/volumes"
Apr 23 19:16:57.673604 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:57.673567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2kw6v_0279be05-6d6b-46d3-9a5d-97af3972be80/global-pull-secret-syncer/0.log"
Apr 23 19:16:57.867630 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:57.867595 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j6hxm_4a1cf606-e60a-4909-8878-950353a863cc/konnectivity-agent/0.log"
Apr 23 19:16:57.975366 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:16:57.975288 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-68.ec2.internal_9475ce23d467a37e0480df7597bbc574/haproxy/0.log"
Apr 23 19:17:01.653957 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:01.653920 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-hnn8l_7e2d9c03-3a31-4a7d-89b2-91f44d29bc91/manager/0.log"
Apr 23 19:17:01.824923 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:01.824898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-l5476_3ea44f52-5a26-411a-83ea-6feb5e67c9fb/manager/0.log"
Apr 23 19:17:03.224363 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:03.224318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vn6j_de127a70-49dc-4497-bc07-40fa5216e03b/kube-state-metrics/0.log"
Apr 23 19:17:03.245669 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:03.245643 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vn6j_de127a70-49dc-4497-bc07-40fa5216e03b/kube-rbac-proxy-main/0.log"
Apr 23 19:17:03.268352 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:03.268326 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vn6j_de127a70-49dc-4497-bc07-40fa5216e03b/kube-rbac-proxy-self/0.log"
Apr 23 19:17:03.298591 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:03.298567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-56c8457455-7bjzd_d418b34f-88f5-4b79-88db-a4f7534d1469/metrics-server/0.log"
Apr 23 19:17:03.539555 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:03.539475 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-swxxb_f075f196-9caf-4281-83b3-edf93558d8f7/node-exporter/0.log"
Apr 23 19:17:03.565223 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:03.565195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-swxxb_f075f196-9caf-4281-83b3-edf93558d8f7/kube-rbac-proxy/0.log"
Apr 23 19:17:03.589654 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:03.589630 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-swxxb_f075f196-9caf-4281-83b3-edf93558d8f7/init-textfile/0.log"
Apr 23 19:17:03.998474 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:03.998442 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-2jncb_4d41989e-e48d-451f-8814-4ab5c5096935/prometheus-operator-admission-webhook/0.log"
Apr 23 19:17:04.145650 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:04.145561 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd7c4cf98-6qxq2_2cc3bd63-d168-4a51-b334-72c496741f86/thanos-query/0.log"
Apr 23 19:17:04.170768 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:04.170708 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd7c4cf98-6qxq2_2cc3bd63-d168-4a51-b334-72c496741f86/kube-rbac-proxy-web/0.log"
Apr 23 19:17:04.199026 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:04.198995 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd7c4cf98-6qxq2_2cc3bd63-d168-4a51-b334-72c496741f86/kube-rbac-proxy/0.log"
Apr 23 19:17:04.241853 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:04.241823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd7c4cf98-6qxq2_2cc3bd63-d168-4a51-b334-72c496741f86/prom-label-proxy/0.log"
Apr 23 19:17:04.270219 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:04.270194 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd7c4cf98-6qxq2_2cc3bd63-d168-4a51-b334-72c496741f86/kube-rbac-proxy-rules/0.log"
Apr 23 19:17:04.296833 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:04.296808 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5cd7c4cf98-6qxq2_2cc3bd63-d168-4a51-b334-72c496741f86/kube-rbac-proxy-metrics/0.log"
Apr 23 19:17:06.065613 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.065588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/1.log"
Apr 23 19:17:06.073990 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.073961 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5nzks_79f1dda8-3f2e-4cdb-99aa-0b76c1a17dd8/console-operator/2.log"
Apr 23 19:17:06.178024 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.177992 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62"]
Apr 23 19:17:06.178367 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178355 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="llm-d-routing-sidecar"
Apr 23 19:17:06.178409 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178369 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="llm-d-routing-sidecar"
Apr 23 19:17:06.178409 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178378 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main"
Apr 23 19:17:06.178409 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178383 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main"
Apr 23 19:17:06.178409 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178394 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main"
Apr 23 19:17:06.178409 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178399 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main"
Apr 23 19:17:06.178565 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178411 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="storage-initializer"
Apr 23 19:17:06.178565 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178418 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="storage-initializer"
Apr 23 19:17:06.178565 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178429 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="storage-initializer"
Apr 23 19:17:06.178565 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178434 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="storage-initializer"
Apr 23 19:17:06.178565 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178490 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="llm-d-routing-sidecar"
Apr 23 19:17:06.178565 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178498 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7877468-a2da-4b90-81a0-28f69dbde278" containerName="main"
Apr 23 19:17:06.178565 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.178510 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d337d849-33a6-4aee-8ee0-47e717c5abc9" containerName="main"
Apr 23 19:17:06.181805 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.181779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62"
Apr 23 19:17:06.185364 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.185336 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4sk6\"/\"kube-root-ca.crt\""
Apr 23 19:17:06.186704 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.186682 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4sk6\"/\"openshift-service-ca.crt\""
Apr 23 19:17:06.186820 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.186704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-j4sk6\"/\"default-dockercfg-tvs57\""
Apr 23 19:17:06.197271 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.197241 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62"]
Apr 23 19:17:06.298457 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.298423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxg6h\" (UniqueName: \"kubernetes.io/projected/117438cb-a1ae-48a5-977c-4a249a45e083-kube-api-access-jxg6h\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62"
Apr 23 19:17:06.298457 ip-10-0-138-68 kubenswrapper[2576]:
I0423 19:17:06.298459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-sys\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.298661 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.298475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-proc\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.298661 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.298502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-lib-modules\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.298661 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.298558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-podres\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.399949 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.399864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-sys\") pod \"perf-node-gather-daemonset-72m62\" (UID: 
\"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.399949 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.399898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-proc\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.399949 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.399915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-lib-modules\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.400233 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.399968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-podres\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.400233 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.399989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-sys\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.400233 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.399995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-proc\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.400233 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.400093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-podres\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.400233 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.400115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/117438cb-a1ae-48a5-977c-4a249a45e083-lib-modules\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.400233 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.400133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxg6h\" (UniqueName: \"kubernetes.io/projected/117438cb-a1ae-48a5-977c-4a249a45e083-kube-api-access-jxg6h\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.410025 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.409987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxg6h\" (UniqueName: \"kubernetes.io/projected/117438cb-a1ae-48a5-977c-4a249a45e083-kube-api-access-jxg6h\") pod \"perf-node-gather-daemonset-72m62\" (UID: \"117438cb-a1ae-48a5-977c-4a249a45e083\") " pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.492156 ip-10-0-138-68 
kubenswrapper[2576]: I0423 19:17:06.492127 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:06.582598 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.582520 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-b85zj_e4ed6eb2-5d97-4853-bf31-2c8cae882d07/download-server/0.log" Apr 23 19:17:06.630671 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:06.630642 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62"] Apr 23 19:17:06.631498 ip-10-0-138-68 kubenswrapper[2576]: W0423 19:17:06.631468 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod117438cb_a1ae_48a5_977c_4a249a45e083.slice/crio-590b55715b91a24a6529da997974b7f3d685caff2511f538aa824cb6f21935a1 WatchSource:0}: Error finding container 590b55715b91a24a6529da997974b7f3d685caff2511f538aa824cb6f21935a1: Status 404 returned error can't find the container with id 590b55715b91a24a6529da997974b7f3d685caff2511f538aa824cb6f21935a1 Apr 23 19:17:07.494973 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:07.494936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" event={"ID":"117438cb-a1ae-48a5-977c-4a249a45e083","Type":"ContainerStarted","Data":"015e72c14fb9f852124d04bca1911a0a1234301318e94d8baaff3f93e4f3142d"} Apr 23 19:17:07.494973 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:07.494975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" event={"ID":"117438cb-a1ae-48a5-977c-4a249a45e083","Type":"ContainerStarted","Data":"590b55715b91a24a6529da997974b7f3d685caff2511f538aa824cb6f21935a1"} Apr 23 19:17:07.495400 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:07.495012 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:07.514852 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:07.514802 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" podStartSLOduration=1.514710056 podStartE2EDuration="1.514710056s" podCreationTimestamp="2026-04-23 19:17:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 19:17:07.51418392 +0000 UTC m=+5735.444196558" watchObservedRunningTime="2026-04-23 19:17:07.514710056 +0000 UTC m=+5735.444722693" Apr 23 19:17:07.899932 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:07.899902 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fxb9d_9fad7629-2a8f-44c8-8668-437d00f77bca/dns/0.log" Apr 23 19:17:07.924217 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:07.924170 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fxb9d_9fad7629-2a8f-44c8-8668-437d00f77bca/kube-rbac-proxy/0.log" Apr 23 19:17:08.049595 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:08.049562 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v6skv_2928f4e6-28bf-471f-bb81-513b3e161d32/dns-node-resolver/0.log" Apr 23 19:17:08.619781 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:08.619750 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6f99d9d489-gqwc4_d0c42f94-a436-4dab-83fb-df3e1012e238/registry/0.log" Apr 23 19:17:08.643038 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:08.643010 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4f55z_19ad9566-830f-4ba3-bed2-db16fce5cd6a/node-ca/0.log" Apr 23 19:17:09.600059 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:09.600031 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-fjvbp_def6ef66-30a7-4f7f-a2d2-1f9921020679/discovery/0.log" Apr 23 19:17:09.643611 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:09.643584 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-6b94bb86d8-j64f2_f2afacd4-04d7-4d25-aa31-0e507e733b70/istio-proxy/0.log" Apr 23 19:17:10.275454 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:10.275424 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z8pdq_1714203d-7df7-4a8f-8d58-69bc1d7062f4/serve-healthcheck-canary/0.log" Apr 23 19:17:10.835472 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:10.835444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dpxlz_93990938-3621-4020-90b7-3824b5530537/kube-rbac-proxy/0.log" Apr 23 19:17:10.858761 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:10.858728 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dpxlz_93990938-3621-4020-90b7-3824b5530537/exporter/0.log" Apr 23 19:17:10.881752 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:10.881704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dpxlz_93990938-3621-4020-90b7-3824b5530537/extractor/0.log" Apr 23 19:17:13.510239 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:13.510214 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-j4sk6/perf-node-gather-daemonset-72m62" Apr 23 19:17:14.298008 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:14.297980 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-tv9wm_25b5f9c2-2f39-43fc-b53e-66bf6bffebea/server/0.log" Apr 23 19:17:14.553545 ip-10-0-138-68 kubenswrapper[2576]: I0423 
19:17:14.553470 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-jcm8g_149abbbd-9ce8-4595-b1fa-9bf891e9d038/s3-init/0.log" Apr 23 19:17:14.582541 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:14.582515 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-4wkm2_dad3b43a-1c3d-4504-b2ad-00d3721005ac/seaweedfs/0.log" Apr 23 19:17:19.944053 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:19.944022 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6nd9l_c1174d10-c6be-499b-bba1-9efb0ba75fac/kube-storage-version-migrator-operator/1.log" Apr 23 19:17:19.944946 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:19.944929 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6nd9l_c1174d10-c6be-499b-bba1-9efb0ba75fac/kube-storage-version-migrator-operator/0.log" Apr 23 19:17:21.097421 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.097392 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5s6bg_91e1f83d-4f6d-434e-b876-d8ab02848d17/kube-multus-additional-cni-plugins/0.log" Apr 23 19:17:21.127425 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.127391 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5s6bg_91e1f83d-4f6d-434e-b876-d8ab02848d17/egress-router-binary-copy/0.log" Apr 23 19:17:21.155413 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.155381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5s6bg_91e1f83d-4f6d-434e-b876-d8ab02848d17/cni-plugins/0.log" Apr 23 19:17:21.188988 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.188961 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5s6bg_91e1f83d-4f6d-434e-b876-d8ab02848d17/bond-cni-plugin/0.log" Apr 23 19:17:21.214886 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.214858 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5s6bg_91e1f83d-4f6d-434e-b876-d8ab02848d17/routeoverride-cni/0.log" Apr 23 19:17:21.245181 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.245147 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5s6bg_91e1f83d-4f6d-434e-b876-d8ab02848d17/whereabouts-cni-bincopy/0.log" Apr 23 19:17:21.273471 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.273444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5s6bg_91e1f83d-4f6d-434e-b876-d8ab02848d17/whereabouts-cni/0.log" Apr 23 19:17:21.710870 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.710842 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kl27l_3c193e30-8c0e-422b-be31-7daf50d7aeb1/kube-multus/0.log" Apr 23 19:17:21.881688 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.881648 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vjjmx_1b961de5-fea1-4bac-9c17-d8682d9a4242/network-metrics-daemon/0.log" Apr 23 19:17:21.903434 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:21.903404 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vjjmx_1b961de5-fea1-4bac-9c17-d8682d9a4242/kube-rbac-proxy/0.log" Apr 23 19:17:23.461892 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.461864 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-controller/0.log" Apr 23 19:17:23.480191 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.480165 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/0.log" Apr 23 19:17:23.503353 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.503307 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovn-acl-logging/1.log" Apr 23 19:17:23.526198 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.526122 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/kube-rbac-proxy-node/0.log" Apr 23 19:17:23.552193 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.552163 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 19:17:23.574077 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.574049 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/northd/0.log" Apr 23 19:17:23.598573 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.598544 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/nbdb/0.log" Apr 23 19:17:23.629344 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.629320 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/sbdb/0.log" Apr 23 19:17:23.726711 ip-10-0-138-68 kubenswrapper[2576]: I0423 19:17:23.726595 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-swcqx_4b659406-d1b9-4f3d-86f2-68515038c182/ovnkube-controller/0.log"