Apr 24 21:27:55.283983 ip-10-0-134-232 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:55.675800 ip-10-0-134-232 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:55.675800 ip-10-0-134-232 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:55.675800 ip-10-0-134-232 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:55.675800 ip-10-0-134-232 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:55.675800 ip-10-0-134-232 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:55.677219 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.677120    2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:55.680977 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.680956    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:55.680977 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.680973    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:55.680977 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.680977    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.680984    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.680991    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.680995    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681000    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681004    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681007    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681012    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681016    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681019    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681029    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681040    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681044    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681048    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681052    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681056    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681060    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681064    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681068    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:55.681183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681073    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681077    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681081    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681085    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681089    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681092    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681098    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681102    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681105    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681109    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681113    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681117    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681129    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681134    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681138    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681143    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681148    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681152    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681156    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681160    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:55.681947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681164    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681168    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681172    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681176    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681180    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681184    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681188    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681191    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681195    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681199    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681203    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681207    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681211    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681216    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681220    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681224    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681228    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681232    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681236    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681240    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:55.682654 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681244    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681248    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681252    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681257    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681261    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681267    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681271    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681276    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681282    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681288    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681292    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681297    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681301    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681305    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681310    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681313    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681317    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681323    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681327    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:55.683188 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681332    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681336    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681340    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681345    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681349    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681353    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681987    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.681997    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682001    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682006    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682010    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682015    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682020    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682025    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682030    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682034    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682038    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682043    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682053    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682057    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:55.683837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682063    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682067    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682072    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682076    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682080    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682084    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682089    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682093    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682097    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682102    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682106    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682114    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682119    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682124    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682128    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682132    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682137    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682141    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682145    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:55.684657 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682149    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682153    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682157    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682161    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682165    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682169    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682173    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682178    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682182    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682186    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682191    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682195    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682200    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682205    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682209    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682213    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682217    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682221    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682225    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:55.685410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682229    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682234    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682238    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682241    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682245    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682249    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682254    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682260    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682264    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682269    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682273    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682277    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682281    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682285    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682289    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682293    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682297    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682301    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682305    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:55.686086 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682309    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682314    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682318    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682324    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682328    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682332    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682338    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682347    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682351    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682355    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682359    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682363    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682367    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682371    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.682376    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683427    2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683443    2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683458    2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683465    2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683472    2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683477    2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683484    2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:55.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683492    2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683497    2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683501    2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683507    2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683513    2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683518    2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683523    2568 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683528    2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683532    2568 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683537    2568 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683541    2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683547    2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683558    2568 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683562    2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683567    2568 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683572    2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683578    2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683585    2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683598    2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683603    2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683609    2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683613    2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683618    2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683623    2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683628    2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:55.687209 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683633    2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683640    2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683645    2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683650    2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683654    2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683658    2568 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683663    2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683675    2568 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683680    2568 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683684    2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683689    2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683694    2568 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683701    2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683721    2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683726    2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683731    2568 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683736    2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683741    2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683745    2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683750    2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683755    2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683759    2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683764    2568 flags.go:64] FLAG: --feature-gates=""
Apr 24
21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683771 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683776 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:27:55.687893 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683782 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683797 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683803 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683809 2568 flags.go:64] FLAG: --help="false" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683834 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-134-232.ec2.internal" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683841 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683846 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683851 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683857 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683863 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683868 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683883 
2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683888 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683893 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683898 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683903 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683908 2568 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683929 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683935 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683941 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683945 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683951 2568 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683956 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683961 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:55.688475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683966 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683981 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:55.689082 
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683985 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683990 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.683995 2568 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684000 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684006 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684010 2568 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684015 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684021 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684035 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684042 2568 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684047 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684052 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684059 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684064 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684069 2568 
flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684074 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684079 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684091 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684096 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684100 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684105 2568 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:55.689082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684110 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684119 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684124 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684129 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684134 2568 flags.go:64] FLAG: --port="10250" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684139 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684144 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d98d13a8b4220b39" Apr 24 21:27:55.689667 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:27:55.684150 2568 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684155 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684160 2568 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684164 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684169 2568 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684175 2568 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684179 2568 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684184 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684189 2568 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684195 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684200 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684205 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684210 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684222 2568 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684227 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:55.689667 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:27:55.684233 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684244 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684249 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684253 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:55.689667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684258 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684263 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684268 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684272 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684277 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684282 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684286 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684292 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684296 2568 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684301 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:27:55.684311 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684316 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684321 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684331 2568 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684336 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684340 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684345 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684350 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684355 2568 flags.go:64] FLAG: --v="2" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684362 2568 flags.go:64] FLAG: --version="false" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684368 2568 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684374 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.684379 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684570 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:55.690357 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684577 2568 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684582 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684586 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684598 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684602 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684610 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684614 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684618 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684622 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684626 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684630 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684635 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684639 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684643 2568 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684648 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684653 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684657 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684662 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684667 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684671 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:55.690974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684676 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684680 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684684 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684689 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684693 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684697 2568 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684718 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684723 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684727 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684731 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684736 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684740 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684744 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684748 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684752 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684756 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684768 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684774 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:55.691468 ip-10-0-134-232 
kubenswrapper[2568]: W0424 21:27:55.684779 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:55.691468 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684783 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684789 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684795 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684800 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684805 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684809 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684813 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684817 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684821 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684825 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684829 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684835 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 
21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684839 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684844 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684848 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684852 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684856 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684860 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684867 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684873 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:55.691947 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684878 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684883 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684888 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684892 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684896 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684901 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684904 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684908 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684912 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684916 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684933 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684937 2568 feature_gate.go:328] 
unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684941 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684946 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684950 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684954 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684958 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684962 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684967 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684971 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:55.692441 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684975 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684979 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684983 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684989 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 
21:27:55.684993 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.684997 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.685010 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.691436 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.691453 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691500 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691505 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691509 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691512 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691515 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:55.692974 ip-10-0-134-232 
kubenswrapper[2568]: W0424 21:27:55.691518 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:55.692974 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691521 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691523 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691526 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691529 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691532 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691535 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691537 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691540 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691543 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691545 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691548 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691550 2568 feature_gate.go:328] 
unrecognized feature gate: SignatureStores Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691553 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691556 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691558 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691561 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691563 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691566 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691569 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691572 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:55.693363 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691574 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691577 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691579 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691582 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:55.693944 
ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691584 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691589 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691591 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691594 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691596 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691599 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691601 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691604 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691607 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691609 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691612 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691614 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691617 2568 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691619 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691622 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691624 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:55.693944 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691627 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691629 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691632 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691635 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691637 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691639 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691642 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691645 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691647 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:55.694444 
ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691650 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691653 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691656 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691658 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691662 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691666 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691669 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691672 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691678 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691682 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:55.694444 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691685 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691688 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691691 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691694 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691697 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691700 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691715 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691718 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691721 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691723 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691726 2568 feature_gate.go:328] unrecognized 
feature gate: AzureWorkloadIdentity Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691729 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691731 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691734 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691737 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691739 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691742 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691744 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691747 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691750 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:55.694927 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691752 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.691758 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691853 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691857 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691860 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691863 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691866 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691869 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691871 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691874 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691877 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691880 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691883 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 
21:27:55.691886 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691889 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691892 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:55.695462 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691894 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691897 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691899 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691902 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691905 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691908 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691911 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691913 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691916 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691918 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 
21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691921 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691923 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691926 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691929 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691931 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691934 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691936 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691939 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691942 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691944 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:55.695891 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691947 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691949 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691952 2568 feature_gate.go:328] unrecognized 
feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691955 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691958 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691961 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691964 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691968 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691973 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691976 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691979 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691981 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691984 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691987 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691990 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 
21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691992 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691995 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.691997 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692000 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692002 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:55.696403 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692005 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692007 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692009 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692012 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692015 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692017 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692020 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692022 2568 feature_gate.go:328] unrecognized feature gate: 
ImageStreamImportMode Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692025 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692027 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692030 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692033 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692036 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692040 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692044 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692046 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692049 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692052 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692055 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:55.696910 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692058 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692061 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692064 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692067 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692070 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692072 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692075 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:55.697383 ip-10-0-134-232 
kubenswrapper[2568]: W0424 21:27:55.692077 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692080 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692083 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692085 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692088 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:55.692090 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.692096 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.692771 2568 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:27:55.697383 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.695115 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:27:55.697761 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.696103 2568 server.go:1019] "Starting client certificate 
rotation"
Apr 24 21:27:55.697761 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.696196 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:55.697761 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.696237 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:55.716168 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.716148 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:55.720871 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.720849 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:55.736016 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.736000 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:27:55.740717 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.740692 2568 log.go:25] "Validated CRI v1 image API"
Apr 24 21:27:55.741985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.741965 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:27:55.748767 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.748743 2568 fs.go:135] Filesystem UUIDs: map[074dcb9d-6094-4056-ada9-48e9b8f3517a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9ee94a18-262f-4dbb-a763-afcfbf658fb6:/dev/nvme0n1p4]
Apr 24 21:27:55.748830 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.748767 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:27:55.752950 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.752931 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:55.754277 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.754180 2568 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:55.752527758 +0000 UTC m=+0.355596531 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100938 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e7d88e85680d2fb9a29fb73cc9efb SystemUUID:ec2e7d88-e856-80d2-fb9a-29fb73cc9efb BootID:c4c125b3-613b-42e7-ae6a-26ba42ac141e Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:48:58:9b:47:6f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:48:58:9b:47:6f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:66:df:0a:07:6d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:27:55.754845 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.754835 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:27:55.754929 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.754918 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:27:55.755944 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.755918 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:27:55.756071 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.755945 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-232.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:27:55.756116 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.756080 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:27:55.756116 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.756088 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:27:55.756116 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.756100 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:27:55.756730 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.756720 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 21:27:55.757996 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.757986 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:27:55.758104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.758095 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 21:27:55.760068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.760059 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 21:27:55.760108 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.760077 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 21:27:55.760153 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.760113 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 21:27:55.760153 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.760130 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 24 21:27:55.760153 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.760138 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 21:27:55.761119 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.761107 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:27:55.761157 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.761125 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 21:27:55.764139 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.764105 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 21:27:55.766124 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.766108 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 21:27:55.767636 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767616 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 21:27:55.767636 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767632 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767639 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767647 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767655 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767661 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767667 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767672 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767679 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767685 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767694 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 24 21:27:55.767737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.767717 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 24 21:27:55.769171 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.769157 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 24 21:27:55.769210 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.769172 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 24 21:27:55.772614 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.772588 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-232.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 21:27:55.772672 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.772620 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-232.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 21:27:55.772672 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.772639 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 21:27:55.772745 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.772731 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 24 21:27:55.772780 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.772762 2568 server.go:1295] "Started kubelet"
Apr 24 21:27:55.772886 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.772861 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 21:27:55.772935 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.772848 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 21:27:55.772935 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.772929 2568 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 21:27:55.773562 ip-10-0-134-232 systemd[1]: Started Kubernetes Kubelet.
Apr 24 21:27:55.775210 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.775196 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 21:27:55.776676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.776662 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 21:27:55.780783 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.780018 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-232.ec2.internal.18a9682e3cd4de11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-232.ec2.internal,UID:ip-10-0-134-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-232.ec2.internal,},FirstTimestamp:2026-04-24 21:27:55.772739089 +0000 UTC m=+0.375807860,LastTimestamp:2026-04-24 21:27:55.772739089 +0000 UTC m=+0.375807860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-232.ec2.internal,}"
Apr 24 21:27:55.782020 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.781905 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 21:27:55.782020 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.781922 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 21:27:55.782645 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.782629 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 21:27:55.783420 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.783409 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 21:27:55.783666 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.782836 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:55.783811 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.783069 2568 factory.go:153] Registering CRI-O factory
Apr 24 21:27:55.784095 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.783933 2568 factory.go:223] Registration of the crio container factory successfully
Apr 24 21:27:55.784171 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.783866 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 21:27:55.784171 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.783863 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 21:27:55.784272 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.784175 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 21:27:55.784272 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.784153 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 21:27:55.784272 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.784212 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 21:27:55.784272 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.784187 2568 factory.go:55] Registering systemd factory
Apr 24 21:27:55.784272 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.784241 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 24 21:27:55.784272 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.784267 2568 factory.go:103] Registering Raw factory
Apr 24 21:27:55.784605 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.784281 2568 manager.go:1196] Started watching for new ooms in manager
Apr 24 21:27:55.784806 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.784786 2568 manager.go:319] Starting recovery of all containers
Apr 24 21:27:55.787698 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.787668 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-232.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 21:27:55.787845 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.787823 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 21:27:55.791649 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.791490 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jrsn4"
Apr 24 21:27:55.796494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.796476 2568 manager.go:324] Recovery completed
Apr 24 21:27:55.800371 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.800359 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:55.802407 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.802392 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jrsn4"
Apr 24 21:27:55.807527 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.807514 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:55.807586 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.807542 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:55.807586 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.807554 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:55.808038 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.808023 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 21:27:55.808038 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.808036 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 21:27:55.808121 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.808051 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 21:27:55.809569 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.809508 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-232.ec2.internal.18a9682e3ee7b2df default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-232.ec2.internal,UID:ip-10-0-134-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-232.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-232.ec2.internal,},FirstTimestamp:2026-04-24 21:27:55.807527647 +0000 UTC m=+0.410596418,LastTimestamp:2026-04-24 21:27:55.807527647 +0000 UTC m=+0.410596418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-232.ec2.internal,}"
Apr 24 21:27:55.811381 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.811366 2568 policy_none.go:49] "None policy: Start"
Apr 24 21:27:55.811381 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.811381 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 21:27:55.811461 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.811391 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.862211 2568 manager.go:341] "Starting Device Plugin manager"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.862235 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.862244 2568 server.go:85] "Starting device plugin registration server"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.862438 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.862453 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.862543 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.862634 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.862643 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.863149 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 21:27:55.866239 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.863212 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:55.910341 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.910317 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 21:27:55.911416 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.911393 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 21:27:55.911416 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.911417 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 21:27:55.911528 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.911432 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 21:27:55.911528 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.911438 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 21:27:55.911528 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.911465 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 21:27:55.914920 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.914904 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:55.963000 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.962948 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:55.963739 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.963723 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:55.963796 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.963764 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:55.963796 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.963778 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:55.963868 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.963803 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-232.ec2.internal"
Apr 24 21:27:55.973852 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:55.973835 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-232.ec2.internal"
Apr 24 21:27:55.973914 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.973857 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-232.ec2.internal\": node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:55.995512 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:55.995489 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.012415 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.012392 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal"]
Apr 24 21:27:56.012495 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.012469 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:56.013192 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.013177 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:56.013261 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.013204 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:56.013261 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.013214 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:56.015489 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.015478 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:56.015641 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.015628 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.015693 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.015654 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:56.016179 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.016161 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:56.016179 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.016170 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:56.016291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.016190 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:56.016291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.016191 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:56.016291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.016210 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:56.016291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.016201 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:56.018378 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.018363 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.018466 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.018386 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 21:27:56.019233 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.019205 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 21:27:56.019233 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.019232 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 21:27:56.019352 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.019247 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeHasSufficientPID"
Apr 24 21:27:56.034223 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.034204 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-232.ec2.internal\" not found" node="ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.037851 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.037836 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-232.ec2.internal\" not found" node="ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.086835 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.086814 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/003d7feb7f049475a7b903bcca7b7224-config\") pod \"kube-apiserver-proxy-ip-10-0-134-232.ec2.internal\" (UID: \"003d7feb7f049475a7b903bcca7b7224\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.086918 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.086838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5690cd701975c873b7a6bf298ee5cb09-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal\" (UID: \"5690cd701975c873b7a6bf298ee5cb09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.086918 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.086867 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5690cd701975c873b7a6bf298ee5cb09-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal\" (UID: \"5690cd701975c873b7a6bf298ee5cb09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.096311 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.096295 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.187632 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.187606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/003d7feb7f049475a7b903bcca7b7224-config\") pod \"kube-apiserver-proxy-ip-10-0-134-232.ec2.internal\" (UID: \"003d7feb7f049475a7b903bcca7b7224\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.187690 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.187645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5690cd701975c873b7a6bf298ee5cb09-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal\" (UID: \"5690cd701975c873b7a6bf298ee5cb09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.187690 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.187669 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/003d7feb7f049475a7b903bcca7b7224-config\") pod \"kube-apiserver-proxy-ip-10-0-134-232.ec2.internal\" (UID: \"003d7feb7f049475a7b903bcca7b7224\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.187690 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.187682 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5690cd701975c873b7a6bf298ee5cb09-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal\" (UID: \"5690cd701975c873b7a6bf298ee5cb09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.187813 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.187733 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5690cd701975c873b7a6bf298ee5cb09-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal\" (UID: \"5690cd701975c873b7a6bf298ee5cb09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.187813 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.187736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5690cd701975c873b7a6bf298ee5cb09-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal\" (UID: \"5690cd701975c873b7a6bf298ee5cb09\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.196877 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.196860 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.297794 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.297771 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.335961 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.335927 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.340467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.340451 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal"
Apr 24 21:27:56.398422 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.398382 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.498989 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.498966 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.599548 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.599484 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.697095 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.697067 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:27:56.697723 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.697205 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:27:56.700195 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.700179 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.783299 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.783261 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:27:56.800826 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.800804 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.804599 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.804570 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:55 +0000 UTC" deadline="2027-09-21 07:54:46.453381376 +0000 UTC"
Apr 24 21:27:56.804676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.804598 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12346h26m49.648786121s"
Apr 24 21:27:56.805793 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.805780 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:56.832526 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.832496 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nhvnh"
Apr 24 21:27:56.841368 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.841342 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nhvnh"
Apr 24 21:27:56.901549 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:56.901482 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found"
Apr 24 21:27:56.943363 ip-10-0-134-232
kubenswrapper[2568]: W0424 21:27:56.943329 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003d7feb7f049475a7b903bcca7b7224.slice/crio-855651e44a5c23e97e5cbe0673d78998075bfcd607e4145381f62b4dcea01fd6 WatchSource:0}: Error finding container 855651e44a5c23e97e5cbe0673d78998075bfcd607e4145381f62b4dcea01fd6: Status 404 returned error can't find the container with id 855651e44a5c23e97e5cbe0673d78998075bfcd607e4145381f62b4dcea01fd6 Apr 24 21:27:56.944065 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:56.944044 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5690cd701975c873b7a6bf298ee5cb09.slice/crio-6001e856d5f58d9c9d17ed5b5daeb51467d9787031c58859b23725e4208bc09f WatchSource:0}: Error finding container 6001e856d5f58d9c9d17ed5b5daeb51467d9787031c58859b23725e4208bc09f: Status 404 returned error can't find the container with id 6001e856d5f58d9c9d17ed5b5daeb51467d9787031c58859b23725e4208bc09f Apr 24 21:27:56.950520 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.950199 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:56.996301 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:56.996278 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:57.001739 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:57.001691 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found" Apr 24 21:27:57.102391 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:57.102366 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found" Apr 24 21:27:57.203043 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:57.202962 2568 kubelet_node_status.go:515] "Error 
getting the current node from lister" err="node \"ip-10-0-134-232.ec2.internal\" not found" Apr 24 21:27:57.243029 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.243005 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:57.282594 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.282551 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal" Apr 24 21:27:57.297886 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.297851 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:57.299491 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.299470 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal" Apr 24 21:27:57.315665 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.315637 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:57.365095 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.364795 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:57.761138 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.761099 2568 apiserver.go:52] "Watching apiserver" Apr 24 21:27:57.769388 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.769361 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:57.771072 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.771044 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-fkdvz","openshift-ovn-kubernetes/ovnkube-node-c6xp2","kube-system/konnectivity-agent-jfk4d","openshift-cluster-node-tuning-operator/tuned-2h8b9","openshift-image-registry/node-ca-8fpvh","openshift-multus/multus-4pwtf","openshift-multus/multus-additional-cni-plugins-s9kdl","openshift-network-diagnostics/network-check-target-js4dn","kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal","openshift-multus/network-metrics-daemon-xrlcl"] Apr 24 21:27:57.774285 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.774262 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.776442 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.776382 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.778533 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.778511 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fkdvz" Apr 24 21:27:57.780997 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.780975 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.783629 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.783610 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jfk4d" Apr 24 21:27:57.785659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.785550 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:57.785659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.785571 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:57.785659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.785580 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:57.785659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.785606 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:57.786006 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.785978 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8fpvh" Apr 24 21:27:57.786100 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.785559 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:57.786201 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.785731 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dcxfz\"" Apr 24 21:27:57.786261 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.785808 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zgmsz\"" Apr 24 21:27:57.788267 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.788251 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4pwtf" Apr 24 21:27:57.790649 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.790603 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:57.793315 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.793295 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:27:57.793410 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:57.793368 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:27:57.795898 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.795672 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:27:57.795898 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.795681 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:27:57.795898 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:57.795747 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:27:57.795898 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.795760 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:57.795898 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.795834 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:57.795898 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.795875 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:27:57.796316 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796104 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:27:57.796316 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796149 2568 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:27:57.796316 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796231 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:27:57.796494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796318 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:57.796494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796389 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:57.796494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796389 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:57.796494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796389 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:57.796494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796442 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:27:57.796494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796478 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s8f2n\"" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796531 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ncv4l\"" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796571 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-kubernetes\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796598 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-host\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-tuned\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796652 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796693 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-kubelet\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 
21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796738 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796757 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-slash\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796771 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796786 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-etc-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.796850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796828 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-lib-modules\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796875 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-socket-dir\") pod 
\"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796895 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wv7f2\"" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-iptables-alerter-script\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796952 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-host-slash\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796973 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.796977 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-log-socket\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797040 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797061 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-cni-bin\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-cni-netd\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797065 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797111 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-env-overrides\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.797377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-systemd\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.797377 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:27:57.797173 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-device-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.797866 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797844 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-etc-selinux\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.797926 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797883 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-systemd\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.797926 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797916 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-ovnkube-script-lib\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.798024 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/057ddaa6-b8e6-4ac0-80db-273cd674b914-agent-certs\") pod 
\"konnectivity-agent-jfk4d\" (UID: \"057ddaa6-b8e6-4ac0-80db-273cd674b914\") " pod="kube-system/konnectivity-agent-jfk4d" Apr 24 21:27:57.798024 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6406f30a-30e3-4227-9670-db1cb68f44b9-serviceca\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh" Apr 24 21:27:57.798024 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.798010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysctl-conf\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.798163 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.798038 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-sys\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.798163 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.798063 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-sys-fs\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.798163 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.797795 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 
21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.798357 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.798496 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jb9zv\"" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.798791 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-var-lib-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.798838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.798988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t926\" (UniqueName: \"kubernetes.io/projected/6639d316-2e21-49c5-baad-539d3602282a-kube-api-access-8t926\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799020 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/057ddaa6-b8e6-4ac0-80db-273cd674b914-konnectivity-ca\") pod \"konnectivity-agent-jfk4d\" (UID: \"057ddaa6-b8e6-4ac0-80db-273cd674b914\") " pod="kube-system/konnectivity-agent-jfk4d" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lps4w\" (UniqueName: \"kubernetes.io/projected/8e6e9263-f0b7-4ab6-a583-6477e0156279-kube-api-access-lps4w\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799079 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-run-netns\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799111 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6639d316-2e21-49c5-baad-539d3602282a-ovn-node-metrics-cert\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799206 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6406f30a-30e3-4227-9670-db1cb68f44b9-host\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799226 
2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5gw7j\"" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-var-lib-kubelet\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799176 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzfhm\" (UniqueName: \"kubernetes.io/projected/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-kube-api-access-tzfhm\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799339 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-systemd-units\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: 
\"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.799472 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-ovn\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799492 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6l7\" (UniqueName: \"kubernetes.io/projected/6406f30a-30e3-4227-9670-db1cb68f44b9-kube-api-access-tc6l7\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-run\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799551 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2jh\" (UniqueName: \"kubernetes.io/projected/ea097fcd-42c0-4f00-9fe2-fa727f428146-kube-api-access-nw2jh\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-registration-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799613 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-run-ovn-kubernetes\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799641 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-ovnkube-config\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799671 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-modprobe-d\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysconfig\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.800291 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:27:57.799764 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysctl-d\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799794 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sp2qp\"" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799817 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea097fcd-42c0-4f00-9fe2-fa727f428146-tmp\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.800291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.799853 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-node-log\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.842372 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.842345 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:56 +0000 UTC" deadline="2027-11-22 16:34:39.63513564 +0000 UTC" Apr 24 21:27:57.842372 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.842370 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13843h6m41.792768319s" Apr 24 21:27:57.885520 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:27:57.885495 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:27:57.900321 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900289 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6639d316-2e21-49c5-baad-539d3602282a-ovn-node-metrics-cert\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900412 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-hostroot\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:57.900412 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzfhm\" (UniqueName: \"kubernetes.io/projected/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-kube-api-access-tzfhm\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz" Apr 24 21:27:57.900412 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-systemd-units\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900571 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900426 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900571 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900490 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900571 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900502 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-ovn\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900696 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6l7\" (UniqueName: \"kubernetes.io/projected/6406f30a-30e3-4227-9670-db1cb68f44b9-kube-api-access-tc6l7\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh" Apr 24 21:27:57.900696 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900589 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-ovn\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900696 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900612 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-kubelet\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:57.900696 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900638 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-daemon-config\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:57.900696 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2jh\" (UniqueName: \"kubernetes.io/projected/ea097fcd-42c0-4f00-9fe2-fa727f428146-kube-api-access-nw2jh\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-registration-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900724 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-systemd-units\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900742 2568 swap_util.go:74] "error creating dir to test if 
tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900753 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-run-ovn-kubernetes\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900784 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-os-release\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900794 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-registration-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-modprobe-d\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-run-ovn-kubernetes\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysconfig\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea097fcd-42c0-4f00-9fe2-fa727f428146-tmp\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.900887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900880 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysconfig\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900895 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-node-log\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-kubelet\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-modprobe-d\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-slash\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900966 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-node-log\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900982 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-cni-netd\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901002 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-kubelet\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t926\" (UniqueName: \"kubernetes.io/projected/6639d316-2e21-49c5-baad-539d3602282a-kube-api-access-8t926\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-cni-netd\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901046 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-system-cni-dir\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901077 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:27:57.901101 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.900984 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-slash\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-lib-modules\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901158 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-host-slash\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz" Apr 24 21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901182 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6406f30a-30e3-4227-9670-db1cb68f44b9-serviceca\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh" Apr 24 
21:27:57.901306 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901215 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901242 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-host-slash\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901240 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-system-cni-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901284 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-cnibin\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901308 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-lib-modules\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901320 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-multus-certs\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-systemd\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901384 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-device-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-etc-selinux\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901418 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-systemd\") pod \"tuned-2h8b9\" 
(UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-systemd\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901459 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901465 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-device-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysctl-conf\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901512 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-sys-fs\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901517 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-run-systemd\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-var-lib-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.902050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901559 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-etc-selinux\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901594 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6225c\" (UniqueName: \"kubernetes.io/projected/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-kube-api-access-6225c\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901650 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lps4w\" (UniqueName: \"kubernetes.io/projected/8e6e9263-f0b7-4ab6-a583-6477e0156279-kube-api-access-lps4w\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysctl-conf\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-run-netns\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6406f30a-30e3-4227-9670-db1cb68f44b9-serviceca\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901688 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6406f30a-30e3-4227-9670-db1cb68f44b9-host\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901725 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-os-release\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901743 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-netns\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901749 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-var-lib-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901759 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-cni-multus\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901771 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-sys-fs\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901783 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-var-lib-kubelet\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901795 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6406f30a-30e3-4227-9670-db1cb68f44b9-host\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901809 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-cni-binary-copy\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-run-netns\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.902825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901853 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-run\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901891 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-run\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-ovnkube-config\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-var-lib-kubelet\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.901970 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-cni-bin\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysctl-d\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902087 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-socket-dir-parent\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902104 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2l6\" (UniqueName: \"kubernetes.io/projected/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-kube-api-access-lt2l6\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-kubernetes\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-host\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-tuned\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902184 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902209 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-etc-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902226 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-cni-bin\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902243 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-sysctl-d\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.903591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902260 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-env-overrides\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902301 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-host\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/057ddaa6-b8e6-4ac0-80db-273cd674b914-konnectivity-ca\") pod \"konnectivity-agent-jfk4d\" (UID: \"057ddaa6-b8e6-4ac0-80db-273cd674b914\") " pod="kube-system/konnectivity-agent-jfk4d"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902360 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-socket-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902387 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-iptables-alerter-script\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902412 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-log-socket\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-cnibin\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902461 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-ovnkube-config\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-kubernetes\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902464 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-cni-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902571 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-env-overrides\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902592 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-conf-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902622 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-ovnkube-script-lib\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902646 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/057ddaa6-b8e6-4ac0-80db-273cd674b914-agent-certs\") pod \"konnectivity-agent-jfk4d\" (UID: \"057ddaa6-b8e6-4ac0-80db-273cd674b914\") " pod="kube-system/konnectivity-agent-jfk4d"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902673 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5bz2\" (UniqueName: \"kubernetes.io/projected/4608f5de-7826-4605-82e0-fc8f5d0e4830-kube-api-access-t5bz2\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-cni-binary-copy\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902765 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-etc-kubernetes\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.904332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902781 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-etc-openvswitch\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902904 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902944 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-sys\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902985 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e6e9263-f0b7-4ab6-a583-6477e0156279-socket-dir\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902972 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-k8s-cni-cncf-io\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.903027 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-host-cni-bin\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.902990 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/057ddaa6-b8e6-4ac0-80db-273cd674b914-konnectivity-ca\") pod \"konnectivity-agent-jfk4d\" (UID: \"057ddaa6-b8e6-4ac0-80db-273cd674b914\") " pod="kube-system/konnectivity-agent-jfk4d"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.903067 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6639d316-2e21-49c5-baad-539d3602282a-log-socket\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.903111 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea097fcd-42c0-4f00-9fe2-fa727f428146-sys\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.903276 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6639d316-2e21-49c5-baad-539d3602282a-ovnkube-script-lib\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.904024 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-iptables-alerter-script\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.904427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6639d316-2e21-49c5-baad-539d3602282a-ovn-node-metrics-cert\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.904985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.904825 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ea097fcd-42c0-4f00-9fe2-fa727f428146-tmp\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.905591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.905205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/057ddaa6-b8e6-4ac0-80db-273cd674b914-agent-certs\") pod \"konnectivity-agent-jfk4d\" (UID: \"057ddaa6-b8e6-4ac0-80db-273cd674b914\") " pod="kube-system/konnectivity-agent-jfk4d"
Apr 24 21:27:57.913321 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.913163 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzfhm\" (UniqueName: \"kubernetes.io/projected/2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5-kube-api-access-tzfhm\") pod \"iptables-alerter-fkdvz\" (UID: \"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5\") " pod="openshift-network-operator/iptables-alerter-fkdvz"
Apr 24 21:27:57.913321 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.913188 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6l7\" (UniqueName: \"kubernetes.io/projected/6406f30a-30e3-4227-9670-db1cb68f44b9-kube-api-access-tc6l7\") pod \"node-ca-8fpvh\" (UID: \"6406f30a-30e3-4227-9670-db1cb68f44b9\") " pod="openshift-image-registry/node-ca-8fpvh"
Apr 24 21:27:57.914452 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.914411 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lps4w\" (UniqueName: \"kubernetes.io/projected/8e6e9263-f0b7-4ab6-a583-6477e0156279-kube-api-access-lps4w\") pod \"aws-ebs-csi-driver-node-b2mfk\" (UID: \"8e6e9263-f0b7-4ab6-a583-6477e0156279\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk"
Apr 24 21:27:57.915051 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.914842 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ea097fcd-42c0-4f00-9fe2-fa727f428146-etc-tuned\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.915377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.915212 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t926\" (UniqueName: \"kubernetes.io/projected/6639d316-2e21-49c5-baad-539d3602282a-kube-api-access-8t926\") pod \"ovnkube-node-c6xp2\" (UID: \"6639d316-2e21-49c5-baad-539d3602282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:27:57.916500 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.916439 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal" event={"ID":"5690cd701975c873b7a6bf298ee5cb09","Type":"ContainerStarted","Data":"6001e856d5f58d9c9d17ed5b5daeb51467d9787031c58859b23725e4208bc09f"}
Apr 24 21:27:57.917126 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.917089 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2jh\" (UniqueName: \"kubernetes.io/projected/ea097fcd-42c0-4f00-9fe2-fa727f428146-kube-api-access-nw2jh\") pod \"tuned-2h8b9\" (UID: \"ea097fcd-42c0-4f00-9fe2-fa727f428146\") " pod="openshift-cluster-node-tuning-operator/tuned-2h8b9"
Apr 24 21:27:57.917515 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:57.917483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal" event={"ID":"003d7feb7f049475a7b903bcca7b7224","Type":"ContainerStarted","Data":"855651e44a5c23e97e5cbe0673d78998075bfcd607e4145381f62b4dcea01fd6"}
Apr 24 21:27:58.003509 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003477 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-system-cni-dir\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.003659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003516 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.003659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.003659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-system-cni-dir\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.003659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003587 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:27:58.003659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-system-cni-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.003659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-cnibin\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.003659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-multus-certs\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003688 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003733 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6225c\" (UniqueName: \"kubernetes.io/projected/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-kube-api-access-6225c\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003762 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-os-release\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-netns\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-cni-multus\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003835 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-cni-binary-copy\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003863 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-cni-bin\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.003899 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003912 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-socket-dir-parent\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.003937 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2l6\" (UniqueName: \"kubernetes.io/projected/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-kube-api-access-lt2l6\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.003961 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs podName:7ca2ae96-23c0-4771-ba4d-46f95e147eb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:58.503944515 +0000 UTC m=+3.107013275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs") pod "network-metrics-daemon-xrlcl" (UID: "7ca2ae96-23c0-4771-ba4d-46f95e147eb7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004002 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-cnibin\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.004025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004020 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-cni-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004036 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-conf-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf"
Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004054 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5bz2\" (UniqueName: \"kubernetes.io/projected/4608f5de-7826-4605-82e0-fc8f5d0e4830-kube-api-access-t5bz2\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl"
Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424
21:27:58.004063 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004076 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-cni-binary-copy\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-etc-kubernetes\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004097 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004108 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-k8s-cni-cncf-io\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 
21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004119 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-cni-multus\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-hostroot\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004166 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-kubelet\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004176 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-multus-certs\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-os-release\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:58.004529 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:27:58.004187 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-system-cni-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-daemon-config\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004232 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-cni-bin\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-os-release\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004281 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-os-release\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.004529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004327 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-cnibin\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004340 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-cnibin\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004368 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-cni-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004383 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-var-lib-kubelet\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004384 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4608f5de-7826-4605-82e0-fc8f5d0e4830-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-conf-dir\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-netns\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004461 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4608f5de-7826-4605-82e0-fc8f5d0e4830-cni-binary-copy\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004498 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-hostroot\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004534 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-socket-dir-parent\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004632 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-host-run-k8s-cni-cncf-io\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004657 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-etc-kubernetes\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.004894 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-cni-binary-copy\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.005606 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.005358 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-multus-daemon-config\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.016324 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.016225 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:58.016324 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.016249 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:58.016324 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.016262 2568 projected.go:194] Error preparing data for projected volume 
kube-api-access-j9mkd for pod openshift-network-diagnostics/network-check-target-js4dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:58.016570 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.016328 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd podName:fcefb7d4-b431-40f5-a0ae-d52f3d85cf97 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:58.516311254 +0000 UTC m=+3.119380020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-j9mkd" (UniqueName: "kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd") pod "network-check-target-js4dn" (UID: "fcefb7d4-b431-40f5-a0ae-d52f3d85cf97") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:58.018406 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.018386 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2l6\" (UniqueName: \"kubernetes.io/projected/05e6f705-e1bb-4e36-a24a-612ad7cf0c56-kube-api-access-lt2l6\") pod \"multus-4pwtf\" (UID: \"05e6f705-e1bb-4e36-a24a-612ad7cf0c56\") " pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.018505 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.018390 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6225c\" (UniqueName: \"kubernetes.io/projected/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-kube-api-access-6225c\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:27:58.019169 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.019142 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t5bz2\" (UniqueName: \"kubernetes.io/projected/4608f5de-7826-4605-82e0-fc8f5d0e4830-kube-api-access-t5bz2\") pod \"multus-additional-cni-plugins-s9kdl\" (UID: \"4608f5de-7826-4605-82e0-fc8f5d0e4830\") " pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:58.085165 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.085138 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" Apr 24 21:27:58.099898 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.099877 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" Apr 24 21:27:58.111511 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.111490 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fkdvz" Apr 24 21:27:58.117094 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.117065 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" Apr 24 21:27:58.123970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.123950 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jfk4d" Apr 24 21:27:58.130033 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.130014 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8fpvh" Apr 24 21:27:58.136199 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.136181 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4pwtf" Apr 24 21:27:58.141311 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.141291 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" Apr 24 21:27:58.169479 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.169460 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:58.506183 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.505964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:27:58.506183 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.506083 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:58.506268 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.506217 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs podName:7ca2ae96-23c0-4771-ba4d-46f95e147eb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:59.506201706 +0000 UTC m=+4.109270465 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs") pod "network-metrics-daemon-xrlcl" (UID: "7ca2ae96-23c0-4771-ba4d-46f95e147eb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:58.520232 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:58.520139 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea097fcd_42c0_4f00_9fe2_fa727f428146.slice/crio-caa49851463eee11ab1b9d15366005f0bfcacd3d19d7611fb46fa759fda9996c WatchSource:0}: Error finding container caa49851463eee11ab1b9d15366005f0bfcacd3d19d7611fb46fa759fda9996c: Status 404 returned error can't find the container with id caa49851463eee11ab1b9d15366005f0bfcacd3d19d7611fb46fa759fda9996c Apr 24 21:27:58.522114 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:58.522094 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6e9263_f0b7_4ab6_a583_6477e0156279.slice/crio-db088c50a92ef991a9e904deaf7810d10c8a9168976e6838f371fe9583dd1756 WatchSource:0}: Error finding container db088c50a92ef991a9e904deaf7810d10c8a9168976e6838f371fe9583dd1756: Status 404 returned error can't find the container with id db088c50a92ef991a9e904deaf7810d10c8a9168976e6838f371fe9583dd1756 Apr 24 21:27:58.524631 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:58.524610 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4608f5de_7826_4605_82e0_fc8f5d0e4830.slice/crio-998e2623ef4d44f6660204bd09abd713367e770036d315d908f3227c8fb96008 WatchSource:0}: Error finding container 998e2623ef4d44f6660204bd09abd713367e770036d315d908f3227c8fb96008: Status 404 returned error can't find the container with id 998e2623ef4d44f6660204bd09abd713367e770036d315d908f3227c8fb96008 Apr 24 21:27:58.525238 
ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:58.525215 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6639d316_2e21_49c5_baad_539d3602282a.slice/crio-ce8171e428faf470a911dd14d01f8976c93b992b9ce451591db5641c96f23bcc WatchSource:0}: Error finding container ce8171e428faf470a911dd14d01f8976c93b992b9ce451591db5641c96f23bcc: Status 404 returned error can't find the container with id ce8171e428faf470a911dd14d01f8976c93b992b9ce451591db5641c96f23bcc Apr 24 21:27:58.526334 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:58.526303 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05e6f705_e1bb_4e36_a24a_612ad7cf0c56.slice/crio-54e87eed13ddfe928f284dca6092e6be8e2977619f68148b7c0e2ad5edefa476 WatchSource:0}: Error finding container 54e87eed13ddfe928f284dca6092e6be8e2977619f68148b7c0e2ad5edefa476: Status 404 returned error can't find the container with id 54e87eed13ddfe928f284dca6092e6be8e2977619f68148b7c0e2ad5edefa476 Apr 24 21:27:58.527903 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:58.527876 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7a70ba_5e8a_4db1_9caf_ec83d1d31fc5.slice/crio-d3587f350b302a331b5e6a2a119babdd7459ac949da971317812c04df5f99c7f WatchSource:0}: Error finding container d3587f350b302a331b5e6a2a119babdd7459ac949da971317812c04df5f99c7f: Status 404 returned error can't find the container with id d3587f350b302a331b5e6a2a119babdd7459ac949da971317812c04df5f99c7f Apr 24 21:27:58.528872 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:58.528594 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6406f30a_30e3_4227_9670_db1cb68f44b9.slice/crio-122e56f393b17f22cc09ba65fef9064a6c6852b18ef1cd58323f2338bb21b7c2 WatchSource:0}: 
Error finding container 122e56f393b17f22cc09ba65fef9064a6c6852b18ef1cd58323f2338bb21b7c2: Status 404 returned error can't find the container with id 122e56f393b17f22cc09ba65fef9064a6c6852b18ef1cd58323f2338bb21b7c2 Apr 24 21:27:58.529242 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:27:58.529221 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod057ddaa6_b8e6_4ac0_80db_273cd674b914.slice/crio-f686f5f9f32d899292457841c0988d4c333385aa58249323cb83e8b1b22b6d48 WatchSource:0}: Error finding container f686f5f9f32d899292457841c0988d4c333385aa58249323cb83e8b1b22b6d48: Status 404 returned error can't find the container with id f686f5f9f32d899292457841c0988d4c333385aa58249323cb83e8b1b22b6d48 Apr 24 21:27:58.607146 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.607125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:27:58.607247 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.607232 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:58.607247 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.607244 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:58.607350 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.607252 2568 projected.go:194] Error preparing data for projected volume kube-api-access-j9mkd for pod openshift-network-diagnostics/network-check-target-js4dn: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:58.607350 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:58.607293 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd podName:fcefb7d4-b431-40f5-a0ae-d52f3d85cf97 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:59.607280461 +0000 UTC m=+4.210349220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-j9mkd" (UniqueName: "kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd") pod "network-check-target-js4dn" (UID: "fcefb7d4-b431-40f5-a0ae-d52f3d85cf97") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:58.842582 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.842473 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:56 +0000 UTC" deadline="2027-10-02 16:47:42.607018493 +0000 UTC" Apr 24 21:27:58.842582 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.842505 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12619h19m43.76451639s" Apr 24 21:27:58.931304 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.931252 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal" event={"ID":"003d7feb7f049475a7b903bcca7b7224","Type":"ContainerStarted","Data":"c8feec258974019b5f0c4491ecf8b9361f40f4e412913edf9504474bb047c623"} Apr 24 21:27:58.933359 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.933295 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-8fpvh" event={"ID":"6406f30a-30e3-4227-9670-db1cb68f44b9","Type":"ContainerStarted","Data":"122e56f393b17f22cc09ba65fef9064a6c6852b18ef1cd58323f2338bb21b7c2"} Apr 24 21:27:58.943872 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.943843 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fkdvz" event={"ID":"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5","Type":"ContainerStarted","Data":"d3587f350b302a331b5e6a2a119babdd7459ac949da971317812c04df5f99c7f"} Apr 24 21:27:58.946290 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.946231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pwtf" event={"ID":"05e6f705-e1bb-4e36-a24a-612ad7cf0c56","Type":"ContainerStarted","Data":"54e87eed13ddfe928f284dca6092e6be8e2977619f68148b7c0e2ad5edefa476"} Apr 24 21:27:58.951820 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.948508 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-232.ec2.internal" podStartSLOduration=1.948494202 podStartE2EDuration="1.948494202s" podCreationTimestamp="2026-04-24 21:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:58.94765772 +0000 UTC m=+3.550726502" watchObservedRunningTime="2026-04-24 21:27:58.948494202 +0000 UTC m=+3.551562984" Apr 24 21:27:58.954637 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.952337 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" event={"ID":"4608f5de-7826-4605-82e0-fc8f5d0e4830","Type":"ContainerStarted","Data":"998e2623ef4d44f6660204bd09abd713367e770036d315d908f3227c8fb96008"} Apr 24 21:27:58.960267 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.960141 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" event={"ID":"8e6e9263-f0b7-4ab6-a583-6477e0156279","Type":"ContainerStarted","Data":"db088c50a92ef991a9e904deaf7810d10c8a9168976e6838f371fe9583dd1756"} Apr 24 21:27:58.966868 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.966822 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" event={"ID":"ea097fcd-42c0-4f00-9fe2-fa727f428146","Type":"ContainerStarted","Data":"caa49851463eee11ab1b9d15366005f0bfcacd3d19d7611fb46fa759fda9996c"} Apr 24 21:27:58.972835 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.972634 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jfk4d" event={"ID":"057ddaa6-b8e6-4ac0-80db-273cd674b914","Type":"ContainerStarted","Data":"f686f5f9f32d899292457841c0988d4c333385aa58249323cb83e8b1b22b6d48"} Apr 24 21:27:58.975968 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:58.975924 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"ce8171e428faf470a911dd14d01f8976c93b992b9ce451591db5641c96f23bcc"} Apr 24 21:27:59.515386 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:59.514688 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:27:59.515386 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:59.514899 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:59.515386 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:59.514965 2568 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs podName:7ca2ae96-23c0-4771-ba4d-46f95e147eb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.514946586 +0000 UTC m=+6.118015350 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs") pod "network-metrics-daemon-xrlcl" (UID: "7ca2ae96-23c0-4771-ba4d-46f95e147eb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:59.615478 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:59.615424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:27:59.615650 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:59.615612 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:59.615650 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:59.615633 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:59.615650 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:59.615645 2568 projected.go:194] Error preparing data for projected volume kube-api-access-j9mkd for pod openshift-network-diagnostics/network-check-target-js4dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:59.615908 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:59.615700 2568 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd podName:fcefb7d4-b431-40f5-a0ae-d52f3d85cf97 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.615682718 +0000 UTC m=+6.218751480 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-j9mkd" (UniqueName: "kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd") pod "network-check-target-js4dn" (UID: "fcefb7d4-b431-40f5-a0ae-d52f3d85cf97") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:59.912976 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:59.912896 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:27:59.913466 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:59.913046 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:27:59.914872 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:59.914848 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:27:59.915001 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:27:59.914951 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:27:59.983530 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:59.983251 2568 generic.go:358] "Generic (PLEG): container finished" podID="5690cd701975c873b7a6bf298ee5cb09" containerID="fda2d7e41b11ba5a98df65b9dd4069cf320da5b2a9699756063924ceea4eb087" exitCode=0 Apr 24 21:27:59.984480 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:27:59.984210 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal" event={"ID":"5690cd701975c873b7a6bf298ee5cb09","Type":"ContainerDied","Data":"fda2d7e41b11ba5a98df65b9dd4069cf320da5b2a9699756063924ceea4eb087"} Apr 24 21:28:00.989629 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:00.989544 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal" event={"ID":"5690cd701975c873b7a6bf298ee5cb09","Type":"ContainerStarted","Data":"017af41cc6a257927ea870c951f37c938ca4b5fa9a2ca949268dd99e123fbb7a"} Apr 24 21:28:01.530463 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.530421 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:01.530659 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:01.530586 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:01.530659 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:01.530656 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs podName:7ca2ae96-23c0-4771-ba4d-46f95e147eb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:05.530636994 +0000 UTC m=+10.133705758 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs") pod "network-metrics-daemon-xrlcl" (UID: "7ca2ae96-23c0-4771-ba4d-46f95e147eb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:01.628899 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.628844 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-232.ec2.internal" podStartSLOduration=4.628823069 podStartE2EDuration="4.628823069s" podCreationTimestamp="2026-04-24 21:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:01.014530439 +0000 UTC m=+5.617599229" watchObservedRunningTime="2026-04-24 21:28:01.628823069 +0000 UTC m=+6.231891852" Apr 24 21:28:01.629461 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.629440 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9v87r"] Apr 24 21:28:01.631514 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.631484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:01.631672 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:01.631656 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:01.631753 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:01.631679 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:01.631753 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:01.631692 2568 projected.go:194] Error preparing data for projected volume kube-api-access-j9mkd for pod openshift-network-diagnostics/network-check-target-js4dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:01.631861 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:01.631758 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd podName:fcefb7d4-b431-40f5-a0ae-d52f3d85cf97 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:05.631742919 +0000 UTC m=+10.234811680 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-j9mkd" (UniqueName: "kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd") pod "network-check-target-js4dn" (UID: "fcefb7d4-b431-40f5-a0ae-d52f3d85cf97") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:01.632291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.632273 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.635372 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.635350 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wwmb2\"" Apr 24 21:28:01.635667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.635650 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:28:01.635754 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.635652 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:28:01.732658 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.732622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqdt\" (UniqueName: \"kubernetes.io/projected/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-kube-api-access-6kqdt\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.732836 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.732682 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-tmp-dir\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.732836 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.732723 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-hosts-file\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.834031 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:28:01.833349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqdt\" (UniqueName: \"kubernetes.io/projected/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-kube-api-access-6kqdt\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.834031 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.833412 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-tmp-dir\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.834031 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.833511 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-hosts-file\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.834031 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.833668 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-hosts-file\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.834321 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.834244 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-tmp-dir\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.850679 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.850363 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6kqdt\" (UniqueName: \"kubernetes.io/projected/66b6cbbe-44e0-47d7-8578-ce2ea2980a91-kube-api-access-6kqdt\") pod \"node-resolver-9v87r\" (UID: \"66b6cbbe-44e0-47d7-8578-ce2ea2980a91\") " pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:01.913243 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.913177 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:01.913434 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:01.913335 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:01.913859 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.913836 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:01.913970 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:01.913946 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:01.945933 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:01.945898 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9v87r" Apr 24 21:28:03.912157 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:03.911666 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:03.912157 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:03.911805 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:03.912157 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:03.911666 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:03.912157 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:03.912107 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:05.563008 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:05.562965 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:05.563467 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:05.563146 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:05.563467 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:05.563231 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs podName:7ca2ae96-23c0-4771-ba4d-46f95e147eb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:13.563209831 +0000 UTC m=+18.166278603 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs") pod "network-metrics-daemon-xrlcl" (UID: "7ca2ae96-23c0-4771-ba4d-46f95e147eb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:05.664152 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:05.664040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:05.664313 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:05.664242 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:05.664313 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:05.664267 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:05.664313 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:05.664281 2568 projected.go:194] Error preparing data for projected volume kube-api-access-j9mkd for pod openshift-network-diagnostics/network-check-target-js4dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:05.664483 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:05.664335 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd podName:fcefb7d4-b431-40f5-a0ae-d52f3d85cf97 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:13.664317521 +0000 UTC m=+18.267386285 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-j9mkd" (UniqueName: "kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd") pod "network-check-target-js4dn" (UID: "fcefb7d4-b431-40f5-a0ae-d52f3d85cf97") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:05.912832 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:05.912794 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:05.913001 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:05.912904 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:05.913429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:05.913274 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:05.913429 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:05.913385 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:07.912559 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:07.912522 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:07.912559 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:07.912549 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:07.913047 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:07.912672 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:07.913047 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:07.912808 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:09.911945 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:09.911910 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:09.912378 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:09.911910 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:09.912378 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:09.912021 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:09.912378 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:09.912089 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:11.912536 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:11.912475 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:11.912991 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:11.912606 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:11.912991 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:11.912666 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:11.912991 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:11.912807 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:13.624760 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:13.624722 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:13.625280 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:13.624857 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:13.625280 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:13.624934 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs podName:7ca2ae96-23c0-4771-ba4d-46f95e147eb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.62491319 +0000 UTC m=+34.227981949 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs") pod "network-metrics-daemon-xrlcl" (UID: "7ca2ae96-23c0-4771-ba4d-46f95e147eb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:13.725124 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:13.725093 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:13.725264 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:13.725247 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:13.725303 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:13.725267 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:13.725303 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:13.725277 2568 projected.go:194] Error preparing data for projected volume kube-api-access-j9mkd for pod openshift-network-diagnostics/network-check-target-js4dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:13.725370 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:13.725321 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd podName:fcefb7d4-b431-40f5-a0ae-d52f3d85cf97 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:29.725309218 +0000 UTC m=+34.328377976 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-j9mkd" (UniqueName: "kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd") pod "network-check-target-js4dn" (UID: "fcefb7d4-b431-40f5-a0ae-d52f3d85cf97") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:13.911700 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:13.911612 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:13.911880 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:13.911634 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:13.911880 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:13.911770 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:13.911880 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:13.911830 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:15.912807 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:15.912383 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:15.913538 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:15.912474 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:15.913538 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:15.912953 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:15.913538 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:15.913004 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:16.014426 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.014391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" event={"ID":"ea097fcd-42c0-4f00-9fe2-fa727f428146","Type":"ContainerStarted","Data":"4ba2dc13858973e1908897c89fc8f4fc24fb2eb981060d1dcd71cd4770fada75"}
Apr 24 21:28:16.015751 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.015722 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jfk4d" event={"ID":"057ddaa6-b8e6-4ac0-80db-273cd674b914","Type":"ContainerStarted","Data":"114d2ced58371c490d86683c23bb0b8407cb0ca38647d528af5241d97673604f"}
Apr 24 21:28:16.018558 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.018539 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log"
Apr 24 21:28:16.022251 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.022216 2568 generic.go:358] "Generic (PLEG): container finished" podID="6639d316-2e21-49c5-baad-539d3602282a" containerID="9310e9776fe2f1555a10b2822827c1ed4f48971a848420e54023317907839e76" exitCode=1
Apr 24 21:28:16.022339 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.022313 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"10fd5137031d2054bcc3c46ea69f3c54697891fb1e25a7f6ff14cfe4edcc5c99"}
Apr 24 21:28:16.022402 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.022350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"eb8a02cf6d54471625e374dde77c438f48e513b5ae466c2001760bf74048cda5"}
Apr 24 21:28:16.022402 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.022365 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"15fb66b1cf2b6a532457ac7cb4e6980e79ace2f74063867eff4d90922ae919eb"}
Apr 24 21:28:16.022402 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.022377 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"dc205f5aa3378f43e9233bd6844ec5231a56cab5b16f49f8b2a18c0c1b123a38"}
Apr 24 21:28:16.022402 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.022390 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerDied","Data":"9310e9776fe2f1555a10b2822827c1ed4f48971a848420e54023317907839e76"}
Apr 24 21:28:16.022575 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.022403 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"5d8bc06ec9ced27a4edeebbd4cc810be52fcad99ad2f969bccbc66f3ae00a189"}
Apr 24 21:28:16.023690 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.023584 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9v87r" event={"ID":"66b6cbbe-44e0-47d7-8578-ce2ea2980a91","Type":"ContainerStarted","Data":"3a0c51212ed517a458e6fd2c95a7aec757c3c78a0c57787753c0bc9a9672d56e"}
Apr 24 21:28:16.023690 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.023619 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9v87r" event={"ID":"66b6cbbe-44e0-47d7-8578-ce2ea2980a91","Type":"ContainerStarted","Data":"7067736f53b0cf47aa67a31ea7cda49f8d4ea8ca3e99af99fe678f19e72777df"}
Apr 24 21:28:16.025033 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.025000 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8fpvh" event={"ID":"6406f30a-30e3-4227-9670-db1cb68f44b9","Type":"ContainerStarted","Data":"3565eb7fe6bcb36eef934bf67ab0ebe1464fd06128b9a8d329e76cf802bff56e"}
Apr 24 21:28:16.026342 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.026312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4pwtf" event={"ID":"05e6f705-e1bb-4e36-a24a-612ad7cf0c56","Type":"ContainerStarted","Data":"05f8ef63049b2edc9684dfda0d053ad287df43c7e4f60256c51a5d8a3bf721e6"}
Apr 24 21:28:16.027747 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.027726 2568 generic.go:358] "Generic (PLEG): container finished" podID="4608f5de-7826-4605-82e0-fc8f5d0e4830" containerID="bfc0803053c326c771e7a801307fb380e8f33ac49dba8f95e7aaa8610c8cb1d1" exitCode=0
Apr 24 21:28:16.027834 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.027792 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" event={"ID":"4608f5de-7826-4605-82e0-fc8f5d0e4830","Type":"ContainerDied","Data":"bfc0803053c326c771e7a801307fb380e8f33ac49dba8f95e7aaa8610c8cb1d1"}
Apr 24 21:28:16.029139 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.029117 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" event={"ID":"8e6e9263-f0b7-4ab6-a583-6477e0156279","Type":"ContainerStarted","Data":"0d305ed93b3f9eac3e8cf7075a5d0ed7c8eda14c16f9eb642cf8d87018b767e0"}
Apr 24 21:28:16.037636 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.037591 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2h8b9" podStartSLOduration=4.319655791 podStartE2EDuration="21.03757828s" podCreationTimestamp="2026-04-24 21:27:55 +0000 UTC" firstStartedPulling="2026-04-24 21:27:58.522179451 +0000 UTC m=+3.125248224" lastFinishedPulling="2026-04-24 21:28:15.240101939 +0000 UTC m=+19.843170713" observedRunningTime="2026-04-24 21:28:16.036526024 +0000 UTC m=+20.639594809" watchObservedRunningTime="2026-04-24 21:28:16.03757828 +0000 UTC m=+20.640647063"
Apr 24 21:28:16.056465 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.056419 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jfk4d" podStartSLOduration=11.49146096 podStartE2EDuration="20.056407977s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:27:58.53121976 +0000 UTC m=+3.134288525" lastFinishedPulling="2026-04-24 21:28:07.096166768 +0000 UTC m=+11.699235542" observedRunningTime="2026-04-24 21:28:16.055868525 +0000 UTC m=+20.658937307" watchObservedRunningTime="2026-04-24 21:28:16.056407977 +0000 UTC m=+20.659476758"
Apr 24 21:28:16.074431 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.074393 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8fpvh" podStartSLOduration=3.400058459 podStartE2EDuration="20.074381936s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:27:58.530805065 +0000 UTC m=+3.133873842" lastFinishedPulling="2026-04-24 21:28:15.205128544 +0000 UTC m=+19.808197319" observedRunningTime="2026-04-24 21:28:16.074244211 +0000 UTC m=+20.677312992" watchObservedRunningTime="2026-04-24 21:28:16.074381936 +0000 UTC m=+20.677450716"
Apr 24 21:28:16.133181 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.133114 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4pwtf" podStartSLOduration=3.288365023 podStartE2EDuration="20.133099493s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:27:58.52809832 +0000 UTC m=+3.131167092" lastFinishedPulling="2026-04-24 21:28:15.372832786 +0000 UTC m=+19.975901562" observedRunningTime="2026-04-24 21:28:16.132742823 +0000 UTC m=+20.735811605" watchObservedRunningTime="2026-04-24 21:28:16.133099493 +0000 UTC m=+20.736168294"
Apr 24 21:28:16.150508 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.150468 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9v87r" podStartSLOduration=15.150456236 podStartE2EDuration="15.150456236s" podCreationTimestamp="2026-04-24 21:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:16.150416403 +0000 UTC m=+20.753485185" watchObservedRunningTime="2026-04-24 21:28:16.150456236 +0000 UTC m=+20.753525016"
Apr 24 21:28:16.455070 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.454932 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:28:16.727900 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.727864 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jfk4d"
Apr 24 21:28:16.728599 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.728576 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jfk4d"
Apr 24 21:28:16.873686 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.873527 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:28:16.45506642Z","UUID":"fce645a3-c96f-4ff7-8e15-7f550f0b39da","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:28:16.876305 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.876272 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:28:16.876305 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.876302 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:28:16.996826 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:16.996798 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9v87r_66b6cbbe-44e0-47d7-8578-ce2ea2980a91/dns-node-resolver/0.log"
Apr 24 21:28:17.032725 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:17.032675 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fkdvz" event={"ID":"2f7a70ba-5e8a-4db1-9caf-ec83d1d31fc5","Type":"ContainerStarted","Data":"bc24a8823ef40d510a39d97b6b5c0c70bd064716188a22ecf12a0a8c443412e5"}
Apr 24 21:28:17.034920 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:17.034891 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" event={"ID":"8e6e9263-f0b7-4ab6-a583-6477e0156279","Type":"ContainerStarted","Data":"6f70d757229cf9c3b9b22e2f9eb252b4d13593bb04072e5539cb40e10751d436"}
Apr 24 21:28:17.912126 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:17.912095 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:17.912233 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:17.912212 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:17.912285 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:17.912244 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:17.912335 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:17.912317 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:17.978315 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:17.978294 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8fpvh_6406f30a-30e3-4227-9670-db1cb68f44b9/node-ca/0.log"
Apr 24 21:28:18.040011 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:18.039984 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log"
Apr 24 21:28:18.040576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:18.040432 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"23a18920596a1b78ea4fc3d982a5e3186827e09b5326927bb21e4cd3e8a73fe4"}
Apr 24 21:28:18.042765 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:18.042738 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" event={"ID":"8e6e9263-f0b7-4ab6-a583-6477e0156279","Type":"ContainerStarted","Data":"1f03f019a78322ab964dae32c8a3323554d28edf077980a5bf8c533b843bf6f0"}
Apr 24 21:28:18.042872 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:18.042827 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:28:18.064666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:18.064619 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b2mfk" podStartSLOduration=4.362099727 podStartE2EDuration="23.064606182s" podCreationTimestamp="2026-04-24 21:27:55 +0000 UTC" firstStartedPulling="2026-04-24 21:27:58.524019942 +0000 UTC m=+3.127088715" lastFinishedPulling="2026-04-24 21:28:17.226526406 +0000 UTC m=+21.829595170" observedRunningTime="2026-04-24 21:28:18.064281517 +0000 UTC m=+22.667350302" watchObservedRunningTime="2026-04-24 21:28:18.064606182 +0000 UTC m=+22.667674962"
Apr 24 21:28:18.064835 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:18.064799 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fkdvz" podStartSLOduration=5.389542722 podStartE2EDuration="22.064792314s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:27:58.530152814 +0000 UTC m=+3.133221573" lastFinishedPulling="2026-04-24 21:28:15.205402393 +0000 UTC m=+19.808471165" observedRunningTime="2026-04-24 21:28:17.04984687 +0000 UTC m=+21.652915655" watchObservedRunningTime="2026-04-24 21:28:18.064792314 +0000 UTC m=+22.667861100"
Apr 24 21:28:19.912047 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:19.912011 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:19.912621 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:19.912063 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:19.912621 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:19.912152 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:19.912621 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:19.912270 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:21.050119 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.049867 2568 generic.go:358] "Generic (PLEG): container finished" podID="4608f5de-7826-4605-82e0-fc8f5d0e4830" containerID="a4e8a5044e1fbc1e98509a0c2df9a2e44a9d8b25a53b87a7d1144537a45743b2" exitCode=0
Apr 24 21:28:21.050527 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.049944 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" event={"ID":"4608f5de-7826-4605-82e0-fc8f5d0e4830","Type":"ContainerDied","Data":"a4e8a5044e1fbc1e98509a0c2df9a2e44a9d8b25a53b87a7d1144537a45743b2"}
Apr 24 21:28:21.054665 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.054648 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log"
Apr 24 21:28:21.055101 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.055082 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"fd63bc5af989a1f3aaabffb23d5cc86af24037bad6b67c6f4a6d02ff6f1d1d2a"}
Apr 24 21:28:21.055429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.055405 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:28:21.055598 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.055583 2568 scope.go:117] "RemoveContainer" containerID="9310e9776fe2f1555a10b2822827c1ed4f48971a848420e54023317907839e76"
Apr 24 21:28:21.070501 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.070484 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:28:21.292554 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.292528 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:28:21.912152 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.912122 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:21.912152 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:21.912138 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:21.912342 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:21.912222 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:21.912342 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:21.912273 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:22.058957 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:22.058929 2568 generic.go:358] "Generic (PLEG): container finished" podID="4608f5de-7826-4605-82e0-fc8f5d0e4830" containerID="9357e11caa5cec4316a2c33c1bad2735dddbc80e4193cca1f2c7ff55f5959ad0" exitCode=0
Apr 24 21:28:22.059321 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:22.059010 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" event={"ID":"4608f5de-7826-4605-82e0-fc8f5d0e4830","Type":"ContainerDied","Data":"9357e11caa5cec4316a2c33c1bad2735dddbc80e4193cca1f2c7ff55f5959ad0"}
Apr 24 21:28:22.062749 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:22.062731 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log"
Apr 24 21:28:22.063104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:22.063076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" event={"ID":"6639d316-2e21-49c5-baad-539d3602282a","Type":"ContainerStarted","Data":"7f598b6c206b2526ae52f96a8bcaae9e0ccd0208a85f4f543d1e9e02cdffb141"}
Apr 24 21:28:22.063329 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:22.063312 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:28:22.077772 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:22.077748 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:28:22.108377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:22.108336 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2" podStartSLOduration=9.335424105 podStartE2EDuration="26.108325492s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:27:58.527221713 +0000 UTC m=+3.130290472" lastFinishedPulling="2026-04-24 21:28:15.300123086 +0000 UTC m=+19.903191859" observedRunningTime="2026-04-24 21:28:22.107765206 +0000 UTC m=+26.710833986" watchObservedRunningTime="2026-04-24 21:28:22.108325492 +0000 UTC m=+26.711394274"
Apr 24 21:28:23.066943 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:23.066917 2568 generic.go:358] "Generic (PLEG): container finished" podID="4608f5de-7826-4605-82e0-fc8f5d0e4830" containerID="8cd53c8c89241f67378abd36ec1e41edf0f0792bee323b55867ffc7970ae4496" exitCode=0
Apr 24 21:28:23.067319 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:23.066997 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" event={"ID":"4608f5de-7826-4605-82e0-fc8f5d0e4830","Type":"ContainerDied","Data":"8cd53c8c89241f67378abd36ec1e41edf0f0792bee323b55867ffc7970ae4496"}
Apr 24 21:28:23.912268 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:23.912228 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:23.912268 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:23.912264 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:23.912490 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:23.912360 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:23.912536 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:23.912512 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:25.724791 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:25.724534 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jfk4d"
Apr 24 21:28:25.725266 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:25.724919 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:28:25.725266 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:25.725195 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jfk4d"
Apr 24 21:28:25.912959 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:25.912929 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:25.913129 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:25.913024 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:25.913129 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:25.913118 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:25.913263 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:25.913239 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:27.911609 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:27.911578 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:27.912113 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:27.911720 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:27.912113 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:27.911767 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:27.912113 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:27.911892 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:29.651166 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:29.651132 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:29.651668 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:29.651244 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:29.651668 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:29.651292 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs podName:7ca2ae96-23c0-4771-ba4d-46f95e147eb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.651278179 +0000 UTC m=+66.254346938 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs") pod "network-metrics-daemon-xrlcl" (UID: "7ca2ae96-23c0-4771-ba4d-46f95e147eb7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:29.751901 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:29.751871 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:29.752024 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:29.752006 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:28:29.752024 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:29.752019 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:28:29.752098 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:29.752027 2568 projected.go:194] Error preparing data for projected volume kube-api-access-j9mkd for pod openshift-network-diagnostics/network-check-target-js4dn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:29.752098 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:29.752073 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd podName:fcefb7d4-b431-40f5-a0ae-d52f3d85cf97 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.752059565 +0000 UTC m=+66.355128323 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-j9mkd" (UniqueName: "kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd") pod "network-check-target-js4dn" (UID: "fcefb7d4-b431-40f5-a0ae-d52f3d85cf97") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:29.912626 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:29.912544 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:29.912802 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:29.912545 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:29.912802 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:29.912630 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:29.912802 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:29.912743 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:30.080476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:30.080442 2568 generic.go:358] "Generic (PLEG): container finished" podID="4608f5de-7826-4605-82e0-fc8f5d0e4830" containerID="d9e41f7f1e75c8c9a85408ccfa957d0b9ddf294cb3ead63b4e8fd9294c1a9e5b" exitCode=0
Apr 24 21:28:30.080476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:30.080479 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" event={"ID":"4608f5de-7826-4605-82e0-fc8f5d0e4830","Type":"ContainerDied","Data":"d9e41f7f1e75c8c9a85408ccfa957d0b9ddf294cb3ead63b4e8fd9294c1a9e5b"}
Apr 24 21:28:31.085290 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:31.085257 2568 generic.go:358] "Generic (PLEG): container finished" podID="4608f5de-7826-4605-82e0-fc8f5d0e4830" containerID="b4a205498c65ada5fb6a5efbfe52881d5a10c1b0d82a645b83c994445c775293" exitCode=0
Apr 24 21:28:31.085722 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:31.085311 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" event={"ID":"4608f5de-7826-4605-82e0-fc8f5d0e4830","Type":"ContainerDied","Data":"b4a205498c65ada5fb6a5efbfe52881d5a10c1b0d82a645b83c994445c775293"}
Apr 24 21:28:31.911846 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:31.911814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:28:31.912036 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:31.911816 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:28:31.912036 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:31.911911 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7"
Apr 24 21:28:31.912036 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:31.911982 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97"
Apr 24 21:28:32.089337 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:32.089306 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" event={"ID":"4608f5de-7826-4605-82e0-fc8f5d0e4830","Type":"ContainerStarted","Data":"edc83d4dcbcdd2054d2ce5808863b3207073036504805b3ce9ade434f06b981c"}
Apr 24 21:28:32.115312 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:32.115265 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s9kdl" podStartSLOduration=5.687367399 podStartE2EDuration="36.115253748s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:27:58.526195864 +0000 UTC m=+3.129264638" lastFinishedPulling="2026-04-24 21:28:28.954082228 +0000 UTC m=+33.557150987" observedRunningTime="2026-04-24 21:28:32.115073865 +0000 UTC m=+36.718142656"
watchObservedRunningTime="2026-04-24 21:28:32.115253748 +0000 UTC m=+36.718322528" Apr 24 21:28:33.911909 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:33.911872 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:33.912299 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:33.911873 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:33.912299 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:33.911973 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:33.912299 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:33.912063 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:35.913253 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:35.913215 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:35.913831 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:35.913354 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:35.913831 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:35.913350 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:35.913831 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:35.913450 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:37.912687 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:37.912649 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:37.912687 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:37.912683 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:37.913176 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:37.912771 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:37.913176 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:37.912826 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:39.218853 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:39.218825 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-js4dn"] Apr 24 21:28:39.219423 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:39.218931 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:39.219423 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:39.219042 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:39.221850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:39.221825 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xrlcl"] Apr 24 21:28:39.221953 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:39.221925 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:39.222061 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:39.222035 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:40.911835 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:40.911801 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:40.912305 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:40.911802 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:40.912305 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:40.911901 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:40.912305 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:40.911977 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:42.911929 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:42.911903 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:42.912286 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:42.911903 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:42.912286 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:42.912055 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xrlcl" podUID="7ca2ae96-23c0-4771-ba4d-46f95e147eb7" Apr 24 21:28:42.912286 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:42.912098 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-js4dn" podUID="fcefb7d4-b431-40f5-a0ae-d52f3d85cf97" Apr 24 21:28:43.266922 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.266895 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-232.ec2.internal" event="NodeReady" Apr 24 21:28:43.267071 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.267017 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:28:43.323069 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.323037 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vqxb9"] Apr 24 21:28:43.335067 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.335042 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rbrfl"] Apr 24 21:28:43.335233 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.335216 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.337859 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.337836 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:28:43.339047 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.338695 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:28:43.339047 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.338693 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:28:43.339047 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.338915 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qk948\"" Apr 24 21:28:43.339269 ip-10-0-134-232 kubenswrapper[2568]: 
I0424 21:28:43.339149 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:28:43.347922 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.347902 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vqxb9"] Apr 24 21:28:43.347989 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.347931 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rbrfl"] Apr 24 21:28:43.348044 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.348032 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.350403 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.350386 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:28:43.350765 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.350745 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xsnkf\"" Apr 24 21:28:43.350869 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.350779 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:28:43.353475 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.353442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxd6z\" (UniqueName: \"kubernetes.io/projected/7fe12bc3-3098-4d2a-bf02-8982253438e3-kube-api-access-gxd6z\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.353566 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.353486 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/7fe12bc3-3098-4d2a-bf02-8982253438e3-tmp-dir\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.353566 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.353514 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fe12bc3-3098-4d2a-bf02-8982253438e3-metrics-tls\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.353566 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.353545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rnj\" (UniqueName: \"kubernetes.io/projected/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-kube-api-access-v4rnj\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.353679 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.353592 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.353679 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.353667 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-crio-socket\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.353780 ip-10-0-134-232 kubenswrapper[2568]: 
I0424 21:28:43.353691 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.353780 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.353740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-data-volume\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.353780 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.353762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fe12bc3-3098-4d2a-bf02-8982253438e3-config-volume\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.426955 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.426924 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-whtpb"] Apr 24 21:28:43.436922 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.436905 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-whtpb" Apr 24 21:28:43.441039 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.441018 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:28:43.441533 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.441508 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-whtpb"] Apr 24 21:28:43.442414 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.442393 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:28:43.442529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.442461 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:28:43.442529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.442491 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ssv7d\"" Apr 24 21:28:43.455022 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455003 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-data-volume\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.455109 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455028 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fe12bc3-3098-4d2a-bf02-8982253438e3-config-volume\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.455109 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:28:43.455057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxd6z\" (UniqueName: \"kubernetes.io/projected/7fe12bc3-3098-4d2a-bf02-8982253438e3-kube-api-access-gxd6z\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.455109 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455072 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fe12bc3-3098-4d2a-bf02-8982253438e3-tmp-dir\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.455109 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fe12bc3-3098-4d2a-bf02-8982253438e3-metrics-tls\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.455283 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npr4b\" (UniqueName: \"kubernetes.io/projected/b82dd24e-152b-4750-951d-1506b5854df1-kube-api-access-npr4b\") pod \"ingress-canary-whtpb\" (UID: \"b82dd24e-152b-4750-951d-1506b5854df1\") " pod="openshift-ingress-canary/ingress-canary-whtpb" Apr 24 21:28:43.455332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rnj\" (UniqueName: \"kubernetes.io/projected/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-kube-api-access-v4rnj\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " 
pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.455332 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.455396 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455354 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82dd24e-152b-4750-951d-1506b5854df1-cert\") pod \"ingress-canary-whtpb\" (UID: \"b82dd24e-152b-4750-951d-1506b5854df1\") " pod="openshift-ingress-canary/ingress-canary-whtpb" Apr 24 21:28:43.455446 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-crio-socket\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.455480 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455464 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.455644 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455625 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-crio-socket\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.455839 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.455824 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.458734 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.458693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-data-volume\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.459160 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.459142 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.461980 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.461958 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fe12bc3-3098-4d2a-bf02-8982253438e3-tmp-dir\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.462087 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.462035 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7fe12bc3-3098-4d2a-bf02-8982253438e3-config-volume\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.462150 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.462098 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7fe12bc3-3098-4d2a-bf02-8982253438e3-metrics-tls\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.464757 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.464737 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxd6z\" (UniqueName: \"kubernetes.io/projected/7fe12bc3-3098-4d2a-bf02-8982253438e3-kube-api-access-gxd6z\") pod \"dns-default-rbrfl\" (UID: \"7fe12bc3-3098-4d2a-bf02-8982253438e3\") " pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.466286 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.466265 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rnj\" (UniqueName: \"kubernetes.io/projected/2bdbfe10-ce76-4858-b4e6-d9443744ee5d-kube-api-access-v4rnj\") pod \"insights-runtime-extractor-vqxb9\" (UID: \"2bdbfe10-ce76-4858-b4e6-d9443744ee5d\") " pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.556521 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.556468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npr4b\" (UniqueName: \"kubernetes.io/projected/b82dd24e-152b-4750-951d-1506b5854df1-kube-api-access-npr4b\") pod \"ingress-canary-whtpb\" (UID: \"b82dd24e-152b-4750-951d-1506b5854df1\") " pod="openshift-ingress-canary/ingress-canary-whtpb" Apr 24 21:28:43.556521 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.556498 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82dd24e-152b-4750-951d-1506b5854df1-cert\") pod \"ingress-canary-whtpb\" (UID: \"b82dd24e-152b-4750-951d-1506b5854df1\") " pod="openshift-ingress-canary/ingress-canary-whtpb" Apr 24 21:28:43.558559 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.558542 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b82dd24e-152b-4750-951d-1506b5854df1-cert\") pod \"ingress-canary-whtpb\" (UID: \"b82dd24e-152b-4750-951d-1506b5854df1\") " pod="openshift-ingress-canary/ingress-canary-whtpb" Apr 24 21:28:43.566667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.566647 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npr4b\" (UniqueName: \"kubernetes.io/projected/b82dd24e-152b-4750-951d-1506b5854df1-kube-api-access-npr4b\") pod \"ingress-canary-whtpb\" (UID: \"b82dd24e-152b-4750-951d-1506b5854df1\") " pod="openshift-ingress-canary/ingress-canary-whtpb" Apr 24 21:28:43.650211 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.650192 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vqxb9" Apr 24 21:28:43.656985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.656955 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:43.746339 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.745840 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-whtpb" Apr 24 21:28:43.814396 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.814328 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vqxb9"] Apr 24 21:28:43.816964 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.816937 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rbrfl"] Apr 24 21:28:43.821057 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:43.821028 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fe12bc3_3098_4d2a_bf02_8982253438e3.slice/crio-10d324709fa505670053a424f634513a4b4ea7c7c631415777ea71ddb5f126ee WatchSource:0}: Error finding container 10d324709fa505670053a424f634513a4b4ea7c7c631415777ea71ddb5f126ee: Status 404 returned error can't find the container with id 10d324709fa505670053a424f634513a4b4ea7c7c631415777ea71ddb5f126ee Apr 24 21:28:43.875100 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.874957 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-whtpb"] Apr 24 21:28:43.878370 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:43.878342 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb82dd24e_152b_4750_951d_1506b5854df1.slice/crio-90d30b4d6d2fa1201493e942060ae60d2999b6c87f9169f8dff6b818eef231bb WatchSource:0}: Error finding container 90d30b4d6d2fa1201493e942060ae60d2999b6c87f9169f8dff6b818eef231bb: Status 404 returned error can't find the container with id 90d30b4d6d2fa1201493e942060ae60d2999b6c87f9169f8dff6b818eef231bb Apr 24 21:28:43.917344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.917320 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-897c595bd-f7cd2"] Apr 24 21:28:43.922131 ip-10-0-134-232 kubenswrapper[2568]: 
I0424 21:28:43.922105 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:43.924443 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.924424 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:28:43.924549 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.924487 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:28:43.924914 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.924759 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:28:43.924914 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.924766 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:28:43.924914 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.924778 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:28:43.924914 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.924795 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:28:43.924914 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.924808 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:28:43.924914 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.924781 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-rq5kh\"" Apr 24 21:28:43.930844 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.930824 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-897c595bd-f7cd2"] Apr 
24 21:28:43.959794 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.959770 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mlv9\" (UniqueName: \"kubernetes.io/projected/516e7cc4-5f56-453f-be11-80d450f1323f-kube-api-access-2mlv9\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:43.959902 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.959833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-console-config\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:43.959902 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.959873 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-oauth-serving-cert\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:43.959999 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.959908 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-serving-cert\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:43.959999 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.959935 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-oauth-config\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:43.959999 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:43.959966 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-service-ca\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.061389 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-serving-cert\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.061460 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-oauth-config\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.061507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-service-ca\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.061565 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2mlv9\" (UniqueName: \"kubernetes.io/projected/516e7cc4-5f56-453f-be11-80d450f1323f-kube-api-access-2mlv9\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.061644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-console-config\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.061692 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-oauth-serving-cert\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.062446 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-oauth-serving-cert\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.062640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-service-ca\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.064623 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:28:44.063334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-console-config\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.065970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.065942 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-oauth-config\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.066084 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.066025 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-serving-cert\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.071388 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.071366 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mlv9\" (UniqueName: \"kubernetes.io/projected/516e7cc4-5f56-453f-be11-80d450f1323f-kube-api-access-2mlv9\") pod \"console-897c595bd-f7cd2\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.107314 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.107278 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-whtpb" event={"ID":"b82dd24e-152b-4750-951d-1506b5854df1","Type":"ContainerStarted","Data":"90d30b4d6d2fa1201493e942060ae60d2999b6c87f9169f8dff6b818eef231bb"} Apr 24 21:28:44.108245 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.108224 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rbrfl" event={"ID":"7fe12bc3-3098-4d2a-bf02-8982253438e3","Type":"ContainerStarted","Data":"10d324709fa505670053a424f634513a4b4ea7c7c631415777ea71ddb5f126ee"} Apr 24 21:28:44.109367 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.109348 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vqxb9" event={"ID":"2bdbfe10-ce76-4858-b4e6-d9443744ee5d","Type":"ContainerStarted","Data":"71b50bff8fa3bcdce13db3211e10202c53f0282600a9e8c15861f5707fe417a2"} Apr 24 21:28:44.109439 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.109374 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vqxb9" event={"ID":"2bdbfe10-ce76-4858-b4e6-d9443744ee5d","Type":"ContainerStarted","Data":"2d69adcdb7f75b6c69e13e33a913d8e21cc7833254f8e8283b8a77694fdad7c8"} Apr 24 21:28:44.230958 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.230921 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:28:44.387494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.387386 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-897c595bd-f7cd2"] Apr 24 21:28:44.403101 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:44.403071 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod516e7cc4_5f56_453f_be11_80d450f1323f.slice/crio-c2cd8810c42170802ce6ce484a25a6b31c7b41388457376f609869d712879894 WatchSource:0}: Error finding container c2cd8810c42170802ce6ce484a25a6b31c7b41388457376f609869d712879894: Status 404 returned error can't find the container with id c2cd8810c42170802ce6ce484a25a6b31c7b41388457376f609869d712879894 Apr 24 21:28:44.911823 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.911782 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:28:44.912001 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.911791 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:28:44.915572 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.915548 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:44.916102 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.916084 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:44.916454 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.916434 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:44.916568 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.916475 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-q7cg2\"" Apr 24 21:28:44.916568 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:44.916501 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7528m\"" Apr 24 21:28:45.112983 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:45.112931 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-897c595bd-f7cd2" event={"ID":"516e7cc4-5f56-453f-be11-80d450f1323f","Type":"ContainerStarted","Data":"c2cd8810c42170802ce6ce484a25a6b31c7b41388457376f609869d712879894"} Apr 24 21:28:47.118430 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.118184 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-whtpb" event={"ID":"b82dd24e-152b-4750-951d-1506b5854df1","Type":"ContainerStarted","Data":"1a6b8cd4c0140ce410375b6300b268e4903415b89cc070bcc54e31222c553459"} Apr 24 21:28:47.119611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.119583 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-rbrfl" event={"ID":"7fe12bc3-3098-4d2a-bf02-8982253438e3","Type":"ContainerStarted","Data":"3b903a3029b5a4975cfac55cf1c0e2c6bee95044d5b0176e7e49134903a7ffb8"} Apr 24 21:28:47.121270 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.121248 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vqxb9" event={"ID":"2bdbfe10-ce76-4858-b4e6-d9443744ee5d","Type":"ContainerStarted","Data":"204812bc21be8e0e784ad6a66e559f3b61277f23fc628eb95c33af90a7e3239f"} Apr 24 21:28:47.143208 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.143161 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-whtpb" podStartSLOduration=1.865201527 podStartE2EDuration="4.14314399s" podCreationTimestamp="2026-04-24 21:28:43 +0000 UTC" firstStartedPulling="2026-04-24 21:28:43.880196065 +0000 UTC m=+48.483264824" lastFinishedPulling="2026-04-24 21:28:46.158138523 +0000 UTC m=+50.761207287" observedRunningTime="2026-04-24 21:28:47.14199849 +0000 UTC m=+51.745067272" watchObservedRunningTime="2026-04-24 21:28:47.14314399 +0000 UTC m=+51.746212771" Apr 24 21:28:47.459232 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.459195 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5"] Apr 24 21:28:47.462613 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.462543 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.466368 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.465969 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 21:28:47.466368 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.466121 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:28:47.466368 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.466211 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:28:47.473779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.472101 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:28:47.473779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.472350 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:28:47.473779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.473035 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2zfkh\"" Apr 24 21:28:47.473999 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.473794 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5"] Apr 24 21:28:47.492781 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.492758 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/19d3d197-f883-46f9-95f2-a59482064cd0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: 
\"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.492919 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.492796 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19d3d197-f883-46f9-95f2-a59482064cd0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.492919 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.492843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4wh6\" (UniqueName: \"kubernetes.io/projected/19d3d197-f883-46f9-95f2-a59482064cd0-kube-api-access-l4wh6\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.493026 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.492959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/19d3d197-f883-46f9-95f2-a59482064cd0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.521427 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.521398 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-n9hgt"] Apr 24 21:28:47.524576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.524551 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.528214 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.528191 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:28:47.529280 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.529258 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-zv9lx\"" Apr 24 21:28:47.529521 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.529503 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 21:28:47.529619 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.529544 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 21:28:47.546815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.546775 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-n9hgt"] Apr 24 21:28:47.547822 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.547736 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k927f"] Apr 24 21:28:47.550954 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.550937 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.553562 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.553539 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sgkz5\"" Apr 24 21:28:47.553659 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.553565 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:28:47.553752 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.553735 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:28:47.553868 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.553852 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:28:47.594210 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594179 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-sys\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594349 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594218 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1edf281d-115d-4f99-b5a2-1ad03eedf97d-metrics-client-ca\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594349 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594306 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sggpq\" 
(UniqueName: \"kubernetes.io/projected/1edf281d-115d-4f99-b5a2-1ad03eedf97d-kube-api-access-sggpq\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594462 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l82kk\" (UniqueName: \"kubernetes.io/projected/8092c8bb-c267-4686-9002-7a52d9a90961-kube-api-access-l82kk\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.594462 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594380 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-accelerators-collector-config\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594462 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594406 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-root\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594462 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594430 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-textfile\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " 
pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594462 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594456 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8092c8bb-c267-4686-9002-7a52d9a90961-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.594660 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594492 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594660 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594570 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/19d3d197-f883-46f9-95f2-a59482064cd0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.594660 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8092c8bb-c267-4686-9002-7a52d9a90961-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.594660 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594629 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/19d3d197-f883-46f9-95f2-a59482064cd0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.594838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19d3d197-f883-46f9-95f2-a59482064cd0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.594838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594690 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-wtmp\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.594838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594787 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-tls\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.594838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4wh6\" (UniqueName: \"kubernetes.io/projected/19d3d197-f883-46f9-95f2-a59482064cd0-kube-api-access-l4wh6\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.595004 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.595004 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.594907 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.595577 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.595555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19d3d197-f883-46f9-95f2-a59482064cd0-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: 
\"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.598642 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.598620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/19d3d197-f883-46f9-95f2-a59482064cd0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.598774 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.598683 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/19d3d197-f883-46f9-95f2-a59482064cd0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.611970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.611947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4wh6\" (UniqueName: \"kubernetes.io/projected/19d3d197-f883-46f9-95f2-a59482064cd0-kube-api-access-l4wh6\") pod \"openshift-state-metrics-9d44df66c-zg5f5\" (UID: \"19d3d197-f883-46f9-95f2-a59482064cd0\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.695992 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.695913 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8092c8bb-c267-4686-9002-7a52d9a90961-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.695992 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:28:47.695965 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-wtmp\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696189 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.695995 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.696189 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696022 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-tls\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696189 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.696189 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696174 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-wtmp\") pod 
\"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696370 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:47.696283 2568 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:28:47.696370 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:47.696345 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-tls podName:1edf281d-115d-4f99-b5a2-1ad03eedf97d nodeName:}" failed. No retries permitted until 2026-04-24 21:28:48.196321495 +0000 UTC m=+52.799390267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-tls") pod "node-exporter-k927f" (UID: "1edf281d-115d-4f99-b5a2-1ad03eedf97d") : secret "node-exporter-tls" not found Apr 24 21:28:47.696482 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696454 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.696534 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696508 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-sys\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696579 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696539 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1edf281d-115d-4f99-b5a2-1ad03eedf97d-metrics-client-ca\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696632 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696586 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sggpq\" (UniqueName: \"kubernetes.io/projected/1edf281d-115d-4f99-b5a2-1ad03eedf97d-kube-api-access-sggpq\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696632 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-sys\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696749 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l82kk\" (UniqueName: \"kubernetes.io/projected/8092c8bb-c267-4686-9002-7a52d9a90961-kube-api-access-l82kk\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.696749 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-accelerators-collector-config\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 
21:28:47.696856 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696749 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-root\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696856 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-textfile\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.696856 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696824 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8092c8bb-c267-4686-9002-7a52d9a90961-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.697009 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696622 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8092c8bb-c267-4686-9002-7a52d9a90961-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.697009 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.696965 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1edf281d-115d-4f99-b5a2-1ad03eedf97d-root\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " 
pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.697250 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.697134 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1edf281d-115d-4f99-b5a2-1ad03eedf97d-metrics-client-ca\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.697250 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.697144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.697250 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.697191 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-textfile\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.697463 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.697445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8092c8bb-c267-4686-9002-7a52d9a90961-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.697645 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.697618 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.697813 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.697787 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-accelerators-collector-config\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.699000 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.698980 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.699782 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.699696 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.699870 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.699820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8092c8bb-c267-4686-9002-7a52d9a90961-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: 
\"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.705515 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.705468 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sggpq\" (UniqueName: \"kubernetes.io/projected/1edf281d-115d-4f99-b5a2-1ad03eedf97d-kube-api-access-sggpq\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:47.706379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.706350 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l82kk\" (UniqueName: \"kubernetes.io/projected/8092c8bb-c267-4686-9002-7a52d9a90961-kube-api-access-l82kk\") pod \"kube-state-metrics-69db897b98-n9hgt\" (UID: \"8092c8bb-c267-4686-9002-7a52d9a90961\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.773436 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.773404 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" Apr 24 21:28:47.834368 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.834340 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" Apr 24 21:28:47.909686 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.909453 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5"] Apr 24 21:28:47.973079 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:47.972994 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-n9hgt"] Apr 24 21:28:48.126547 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.126504 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rbrfl" event={"ID":"7fe12bc3-3098-4d2a-bf02-8982253438e3","Type":"ContainerStarted","Data":"aa3fc0227e107e5fcd0cb1611b839297773438c04a46b6953254572b2926bbeb"} Apr 24 21:28:48.127125 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.126640 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rbrfl" Apr 24 21:28:48.128111 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.128082 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-897c595bd-f7cd2" event={"ID":"516e7cc4-5f56-453f-be11-80d450f1323f","Type":"ContainerStarted","Data":"880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60"} Apr 24 21:28:48.144336 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.144281 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rbrfl" podStartSLOduration=2.813361108 podStartE2EDuration="5.144265738s" podCreationTimestamp="2026-04-24 21:28:43 +0000 UTC" firstStartedPulling="2026-04-24 21:28:43.822780322 +0000 UTC m=+48.425849094" lastFinishedPulling="2026-04-24 21:28:46.153684954 +0000 UTC m=+50.756753724" observedRunningTime="2026-04-24 21:28:48.143084479 +0000 UTC m=+52.746153272" watchObservedRunningTime="2026-04-24 21:28:48.144265738 +0000 UTC m=+52.747334522" Apr 24 
21:28:48.160257 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.160220 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-897c595bd-f7cd2" podStartSLOduration=2.227250037 podStartE2EDuration="5.160208923s" podCreationTimestamp="2026-04-24 21:28:43 +0000 UTC" firstStartedPulling="2026-04-24 21:28:44.405602575 +0000 UTC m=+49.008671335" lastFinishedPulling="2026-04-24 21:28:47.33856145 +0000 UTC m=+51.941630221" observedRunningTime="2026-04-24 21:28:48.159647703 +0000 UTC m=+52.762716488" watchObservedRunningTime="2026-04-24 21:28:48.160208923 +0000 UTC m=+52.763277703" Apr 24 21:28:48.201114 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.201085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-tls\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:48.204512 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.204494 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1edf281d-115d-4f99-b5a2-1ad03eedf97d-node-exporter-tls\") pod \"node-exporter-k927f\" (UID: \"1edf281d-115d-4f99-b5a2-1ad03eedf97d\") " pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:48.293519 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:48.293491 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d3d197_f883_46f9_95f2_a59482064cd0.slice/crio-8a4c4760c7e24f0cc2cfe07dd634724555ea2db149082ec0578503dd0b01919d WatchSource:0}: Error finding container 8a4c4760c7e24f0cc2cfe07dd634724555ea2db149082ec0578503dd0b01919d: Status 404 returned error can't find the container with id 8a4c4760c7e24f0cc2cfe07dd634724555ea2db149082ec0578503dd0b01919d Apr 24 
21:28:48.294476 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:48.294056 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8092c8bb_c267_4686_9002_7a52d9a90961.slice/crio-acc641ab38a2d90ee9fe53403e14b64e3e3f6ce6c1c8fd835728ef4dd3dc0d51 WatchSource:0}: Error finding container acc641ab38a2d90ee9fe53403e14b64e3e3f6ce6c1c8fd835728ef4dd3dc0d51: Status 404 returned error can't find the container with id acc641ab38a2d90ee9fe53403e14b64e3e3f6ce6c1c8fd835728ef4dd3dc0d51 Apr 24 21:28:48.460435 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.460269 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k927f" Apr 24 21:28:48.468059 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:48.468033 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1edf281d_115d_4f99_b5a2_1ad03eedf97d.slice/crio-5c925f097e56f8dcff0c5338efcc1202ad41f588eeed263356ed9c3e25eeaa37 WatchSource:0}: Error finding container 5c925f097e56f8dcff0c5338efcc1202ad41f588eeed263356ed9c3e25eeaa37: Status 404 returned error can't find the container with id 5c925f097e56f8dcff0c5338efcc1202ad41f588eeed263356ed9c3e25eeaa37 Apr 24 21:28:48.550033 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.549945 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:28:48.554483 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.554467 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.556763 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.556741 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:28:48.556763 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.556755 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:28:48.556763 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.556769 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:28:48.557164 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.556892 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:28:48.557232 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.557108 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4kqqb\"" Apr 24 21:28:48.558284 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.557649 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:28:48.558284 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.557663 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:28:48.558284 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.557664 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:28:48.558284 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.557771 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:28:48.558284 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.557795 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:28:48.575097 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.575077 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:28:48.604386 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604360 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604484 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604390 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-web-config\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604484 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604484 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604460 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-out\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604484 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604476 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-volume\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604631 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604494 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604631 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604518 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604631 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604576 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604631 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604601 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhb5f\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-kube-api-access-lhb5f\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604837 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604643 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604837 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604661 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604837 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604683 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.604837 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.604700 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.705065 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705039 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.705214 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.705214 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-web-config\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.705214 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
24 21:28:48.705214 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:48.705203 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle podName:32fdce74-05d6-4fd9-a65e-0a9f20c44cdc nodeName:}" failed. No retries permitted until 2026-04-24 21:28:49.205179474 +0000 UTC m=+53.808248247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc") : configmap references non-existent config key: ca-bundle.crt Apr 24 21:28:48.705509 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705488 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-out\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.705567 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705521 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-volume\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.705567 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.705634 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:28:48.705579 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:28:48.705724 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:48.705689 2568 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 24 21:28:48.705828 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:48.705760 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls podName:32fdce74-05d6-4fd9-a65e-0a9f20c44cdc nodeName:}" failed. No retries permitted until 2026-04-24 21:28:49.205748626 +0000 UTC m=+53.808817385 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc") : secret "alertmanager-main-tls" not found
Apr 24 21:28:48.705828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.705828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705808 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhb5f\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-kube-api-access-lhb5f\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.705828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.706058 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705844 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.706058 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.705874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.706747 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.706379 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.707782 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.707758 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.709317 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.709293 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-out\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.709317 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.709308 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.709470 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.709335 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.709470 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.709394 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.709582 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.709521 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.710219 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.710199 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-web-config\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.710315 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.710203 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-volume\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.710315 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.710274 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:48.716349 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:48.716320 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhb5f\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-kube-api-access-lhb5f\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:49.133561 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.133510 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" event={"ID":"8092c8bb-c267-4686-9002-7a52d9a90961","Type":"ContainerStarted","Data":"acc641ab38a2d90ee9fe53403e14b64e3e3f6ce6c1c8fd835728ef4dd3dc0d51"}
Apr 24 21:28:49.135786 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.135752 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vqxb9" event={"ID":"2bdbfe10-ce76-4858-b4e6-d9443744ee5d","Type":"ContainerStarted","Data":"fcc3f13a1f3be50706dabed927ffd4bfc97f65b8fce77cc728c0ebf5e9ec4179"}
Apr 24 21:28:49.137062 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.137032 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k927f" event={"ID":"1edf281d-115d-4f99-b5a2-1ad03eedf97d","Type":"ContainerStarted","Data":"5c925f097e56f8dcff0c5338efcc1202ad41f588eeed263356ed9c3e25eeaa37"}
Apr 24 21:28:49.138723 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.138680 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" event={"ID":"19d3d197-f883-46f9-95f2-a59482064cd0","Type":"ContainerStarted","Data":"b8992e7272a98ef1aa66192c346d2d1f850b2990cee3bd5b12651e6f0408ee0b"}
Apr 24 21:28:49.138723 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.138724 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" event={"ID":"19d3d197-f883-46f9-95f2-a59482064cd0","Type":"ContainerStarted","Data":"6d5c2036b281712a6a8c32b60431c1c8a3d8cd6f117e2f572d6660261fbca9ed"}
Apr 24 21:28:49.138944 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.138739 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" event={"ID":"19d3d197-f883-46f9-95f2-a59482064cd0","Type":"ContainerStarted","Data":"8a4c4760c7e24f0cc2cfe07dd634724555ea2db149082ec0578503dd0b01919d"}
Apr 24 21:28:49.155268 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.155220 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vqxb9" podStartSLOduration=1.7100651569999998 podStartE2EDuration="6.155205677s" podCreationTimestamp="2026-04-24 21:28:43 +0000 UTC" firstStartedPulling="2026-04-24 21:28:43.893873526 +0000 UTC m=+48.496942285" lastFinishedPulling="2026-04-24 21:28:48.339014034 +0000 UTC m=+52.942082805" observedRunningTime="2026-04-24 21:28:49.154286421 +0000 UTC m=+53.757355203" watchObservedRunningTime="2026-04-24 21:28:49.155205677 +0000 UTC m=+53.758274457"
Apr 24 21:28:49.211888 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.211851 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:49.212213 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.212098 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:49.213296 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.213269 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:49.214745 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.214725 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:49.465825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.465735 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:28:49.568065 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.568029 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-64fd45769b-zll9c"]
Apr 24 21:28:49.571798 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.571780 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.575233 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.575210 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 21:28:49.575724 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.575687 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 21:28:49.575724 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.575685 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 21:28:49.576147 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.576004 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-rjc4v\""
Apr 24 21:28:49.576147 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.576019 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-1qnh24ukilqg7\""
Apr 24 21:28:49.576147 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.576032 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 21:28:49.576147 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.576015 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 21:28:49.584247 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.584223 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-64fd45769b-zll9c"]
Apr 24 21:28:49.615507 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.615465 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.615620 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.615558 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.615620 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.615597 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-tls\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.615752 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.615627 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.615805 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.615752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.615853 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.615799 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fce3a25-2bba-432f-984f-bedaa6e050c1-metrics-client-ca\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.615853 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.615826 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-grpc-tls\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.615951 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.615871 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt96w\" (UniqueName: \"kubernetes.io/projected/3fce3a25-2bba-432f-984f-bedaa6e050c1-kube-api-access-jt96w\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.716858 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.716787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.716858 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.716826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fce3a25-2bba-432f-984f-bedaa6e050c1-metrics-client-ca\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.716858 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.716843 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-grpc-tls\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.717104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.716863 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt96w\" (UniqueName: \"kubernetes.io/projected/3fce3a25-2bba-432f-984f-bedaa6e050c1-kube-api-access-jt96w\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.717104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.716885 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.717193 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.717118 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.717193 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.717166 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-tls\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.717292 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.717223 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.717555 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.717510 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fce3a25-2bba-432f-984f-bedaa6e050c1-metrics-client-ca\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.719484 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.719455 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.719577 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.719555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.720014 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.719994 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.720014 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.720006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.720178 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.720152 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-thanos-querier-tls\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.720265 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.720221 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3fce3a25-2bba-432f-984f-bedaa6e050c1-secret-grpc-tls\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.726490 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.726473 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt96w\" (UniqueName: \"kubernetes.io/projected/3fce3a25-2bba-432f-984f-bedaa6e050c1-kube-api-access-jt96w\") pod \"thanos-querier-64fd45769b-zll9c\" (UID: \"3fce3a25-2bba-432f-984f-bedaa6e050c1\") " pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.881948 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.881896 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:49.920667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:49.920627 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:28:50.046743 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.046693 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-64fd45769b-zll9c"]
Apr 24 21:28:50.051516 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:50.051484 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fce3a25_2bba_432f_984f_bedaa6e050c1.slice/crio-7a2fd428e3277708e8ce7f662bb4ab876398e8372c9b16802bfc39a806e8a604 WatchSource:0}: Error finding container 7a2fd428e3277708e8ce7f662bb4ab876398e8372c9b16802bfc39a806e8a604: Status 404 returned error can't find the container with id 7a2fd428e3277708e8ce7f662bb4ab876398e8372c9b16802bfc39a806e8a604
Apr 24 21:28:50.142460 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.142425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c" event={"ID":"3fce3a25-2bba-432f-984f-bedaa6e050c1","Type":"ContainerStarted","Data":"7a2fd428e3277708e8ce7f662bb4ab876398e8372c9b16802bfc39a806e8a604"}
Apr 24 21:28:50.144251 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.144218 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" event={"ID":"19d3d197-f883-46f9-95f2-a59482064cd0","Type":"ContainerStarted","Data":"ed69ce5361d2183f389ff178abd6104946c0b388742c05933f65d1125db4f764"}
Apr 24 21:28:50.146141 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.146120 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" event={"ID":"8092c8bb-c267-4686-9002-7a52d9a90961","Type":"ContainerStarted","Data":"0d12a326a28fbac5127e251d55fdee835323b831dcffe1d3fb3a749ddcd0587f"}
Apr 24 21:28:50.146266 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.146144 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" event={"ID":"8092c8bb-c267-4686-9002-7a52d9a90961","Type":"ContainerStarted","Data":"421697d780492f68943697de42b42b7181bf4d1bc9d4e0624a786375485580c1"}
Apr 24 21:28:50.146266 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.146154 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" event={"ID":"8092c8bb-c267-4686-9002-7a52d9a90961","Type":"ContainerStarted","Data":"0fd4fb9c8a64802e51d14544bcf34b6f73d2e89440b35ff2de3e9752ca8f44ed"}
Apr 24 21:28:50.147174 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.147153 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerStarted","Data":"8d31837c9f91b6e58ae85aa8b1ad6576e5f55c6fe55419bc7f33852441f28371"}
Apr 24 21:28:50.148743 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.148719 2568 generic.go:358] "Generic (PLEG): container finished" podID="1edf281d-115d-4f99-b5a2-1ad03eedf97d" containerID="e188a8858b073b21fbc72a670e73444a20baa6f3c6b6e01e7ba6f2f41081837d" exitCode=0
Apr 24 21:28:50.148852 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.148797 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k927f" event={"ID":"1edf281d-115d-4f99-b5a2-1ad03eedf97d","Type":"ContainerDied","Data":"e188a8858b073b21fbc72a670e73444a20baa6f3c6b6e01e7ba6f2f41081837d"}
Apr 24 21:28:50.162754 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.162690 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zg5f5" podStartSLOduration=1.806720484 podStartE2EDuration="3.162675572s" podCreationTimestamp="2026-04-24 21:28:47 +0000 UTC" firstStartedPulling="2026-04-24 21:28:48.413857984 +0000 UTC m=+53.016926757" lastFinishedPulling="2026-04-24 21:28:49.769813072 +0000 UTC m=+54.372881845" observedRunningTime="2026-04-24 21:28:50.161859575 +0000 UTC m=+54.764928356" watchObservedRunningTime="2026-04-24 21:28:50.162675572 +0000 UTC m=+54.765744354"
Apr 24 21:28:50.178992 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.178949 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-n9hgt" podStartSLOduration=1.704792168 podStartE2EDuration="3.178935577s" podCreationTimestamp="2026-04-24 21:28:47 +0000 UTC" firstStartedPulling="2026-04-24 21:28:48.29599918 +0000 UTC m=+52.899067943" lastFinishedPulling="2026-04-24 21:28:49.77014258 +0000 UTC m=+54.373211352" observedRunningTime="2026-04-24 21:28:50.178483507 +0000 UTC m=+54.781552289" watchObservedRunningTime="2026-04-24 21:28:50.178935577 +0000 UTC m=+54.782004359"
Apr 24 21:28:50.705944 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.705886 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c5ddffb54-ltk5x"]
Apr 24 21:28:50.709994 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.709605 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.719945 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.719884 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 21:28:50.720960 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.720835 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c5ddffb54-ltk5x"]
Apr 24 21:28:50.827801 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.827764 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-oauth-serving-cert\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.827970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.827821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-config\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.827970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.827848 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-oauth-config\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.827970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.827873 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt6sh\" (UniqueName: \"kubernetes.io/projected/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-kube-api-access-zt6sh\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.827970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.827965 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-service-ca\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.828169 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.828035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-serving-cert\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.828169 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.828109 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-trusted-ca-bundle\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.929328 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.929296 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-service-ca\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.929501 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.929342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-serving-cert\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.929501 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.929382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-trusted-ca-bundle\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.929501 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.929406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-oauth-serving-cert\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.929501 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.929440 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-config\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.930078 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.930049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-oauth-config\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.930171 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.930132 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zt6sh\" (UniqueName: \"kubernetes.io/projected/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-kube-api-access-zt6sh\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.930784 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.930734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-oauth-serving-cert\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.930784 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.930734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-service-ca\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.932486 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.931206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-config\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:28:50.932486 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.931430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-trusted-ca-bundle\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") "
pod="openshift-console/console-6c5ddffb54-ltk5x" Apr 24 21:28:50.936805 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.936775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-serving-cert\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x" Apr 24 21:28:50.938950 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.938894 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-oauth-config\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x" Apr 24 21:28:50.940908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:50.940884 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt6sh\" (UniqueName: \"kubernetes.io/projected/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-kube-api-access-zt6sh\") pod \"console-6c5ddffb54-ltk5x\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " pod="openshift-console/console-6c5ddffb54-ltk5x" Apr 24 21:28:51.023953 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.023825 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c5ddffb54-ltk5x" Apr 24 21:28:51.156115 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.156079 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k927f" event={"ID":"1edf281d-115d-4f99-b5a2-1ad03eedf97d","Type":"ContainerStarted","Data":"875a5de73a664d488414b3393f08047ad7dec5cc52c5190dd0b67a9c856334b2"} Apr 24 21:28:51.156498 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.156122 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k927f" event={"ID":"1edf281d-115d-4f99-b5a2-1ad03eedf97d","Type":"ContainerStarted","Data":"e03c84dde50d9a471397cc1dd7037e099bed690df8b93efd63a3625533bdf01d"} Apr 24 21:28:51.177344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.177292 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k927f" podStartSLOduration=2.877043441 podStartE2EDuration="4.177277206s" podCreationTimestamp="2026-04-24 21:28:47 +0000 UTC" firstStartedPulling="2026-04-24 21:28:48.469895719 +0000 UTC m=+53.072964479" lastFinishedPulling="2026-04-24 21:28:49.770129472 +0000 UTC m=+54.373198244" observedRunningTime="2026-04-24 21:28:51.176577865 +0000 UTC m=+55.779646686" watchObservedRunningTime="2026-04-24 21:28:51.177277206 +0000 UTC m=+55.780345989" Apr 24 21:28:51.231140 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.231112 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c5ddffb54-ltk5x"] Apr 24 21:28:51.241410 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:51.241386 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3f7371e_f0da_45c4_9ec2_1206b6bc9034.slice/crio-d920e6e61a177cbb00b8bee2842be0ddf6bc76e62bfad08d5f59e6dd078c71cd WatchSource:0}: Error finding container 
d920e6e61a177cbb00b8bee2842be0ddf6bc76e62bfad08d5f59e6dd078c71cd: Status 404 returned error can't find the container with id d920e6e61a177cbb00b8bee2842be0ddf6bc76e62bfad08d5f59e6dd078c71cd Apr 24 21:28:51.780000 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.779964 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7d7c9954dd-2z9dc"] Apr 24 21:28:51.818984 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.818957 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d7c9954dd-2z9dc"] Apr 24 21:28:51.819120 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.819087 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:51.822539 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.822424 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:28:51.822539 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.822461 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 21:28:51.822539 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.822424 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-f726b0vsvniug\"" Apr 24 21:28:51.822539 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.822535 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 21:28:51.822834 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.822468 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 21:28:51.822834 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.822492 2568 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-74dnv\"" Apr 24 21:28:51.940473 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.940437 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a4eb4b4f-7919-4494-8c58-56a18513db6a-audit-log\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:51.940642 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.940527 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6trl\" (UniqueName: \"kubernetes.io/projected/a4eb4b4f-7919-4494-8c58-56a18513db6a-kube-api-access-r6trl\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:51.940642 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.940559 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-secret-metrics-server-tls\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:51.940642 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.940594 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-secret-metrics-server-client-certs\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:51.940642 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:28:51.940621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4eb4b4f-7919-4494-8c58-56a18513db6a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:51.940867 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.940673 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a4eb4b4f-7919-4494-8c58-56a18513db6a-metrics-server-audit-profiles\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:51.940867 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:51.940762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-client-ca-bundle\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.041779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.041682 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a4eb4b4f-7919-4494-8c58-56a18513db6a-metrics-server-audit-profiles\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.041779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.041750 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-client-ca-bundle\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.041987 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.041792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a4eb4b4f-7919-4494-8c58-56a18513db6a-audit-log\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.041987 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.041840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6trl\" (UniqueName: \"kubernetes.io/projected/a4eb4b4f-7919-4494-8c58-56a18513db6a-kube-api-access-r6trl\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.041987 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.041867 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-secret-metrics-server-tls\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.041987 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.041889 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-secret-metrics-server-client-certs\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " 
pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.041987 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.041906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4eb4b4f-7919-4494-8c58-56a18513db6a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.042259 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.042232 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a4eb4b4f-7919-4494-8c58-56a18513db6a-audit-log\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.042655 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.042629 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a4eb4b4f-7919-4494-8c58-56a18513db6a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.042764 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.042692 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a4eb4b4f-7919-4494-8c58-56a18513db6a-metrics-server-audit-profiles\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.044242 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.044219 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-secret-metrics-server-tls\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.044318 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.044268 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-client-ca-bundle\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.044438 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.044417 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a4eb4b4f-7919-4494-8c58-56a18513db6a-secret-metrics-server-client-certs\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.051228 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.051206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6trl\" (UniqueName: \"kubernetes.io/projected/a4eb4b4f-7919-4494-8c58-56a18513db6a-kube-api-access-r6trl\") pod \"metrics-server-7d7c9954dd-2z9dc\" (UID: \"a4eb4b4f-7919-4494-8c58-56a18513db6a\") " pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.128876 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.128851 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:28:52.160432 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.160339 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c5ddffb54-ltk5x" event={"ID":"a3f7371e-f0da-45c4-9ec2-1206b6bc9034","Type":"ContainerStarted","Data":"f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c"} Apr 24 21:28:52.160432 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.160388 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c5ddffb54-ltk5x" event={"ID":"a3f7371e-f0da-45c4-9ec2-1206b6bc9034","Type":"ContainerStarted","Data":"d920e6e61a177cbb00b8bee2842be0ddf6bc76e62bfad08d5f59e6dd078c71cd"} Apr 24 21:28:52.190435 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.190392 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c5ddffb54-ltk5x" podStartSLOduration=2.190373855 podStartE2EDuration="2.190373855s" podCreationTimestamp="2026-04-24 21:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:52.189058289 +0000 UTC m=+56.792127070" watchObservedRunningTime="2026-04-24 21:28:52.190373855 +0000 UTC m=+56.793442636" Apr 24 21:28:52.236096 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.236062 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8"] Apr 24 21:28:52.247124 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.247103 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8"] Apr 24 21:28:52.247269 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.247216 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" Apr 24 21:28:52.249458 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.249433 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 21:28:52.249546 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.249491 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-dzwlm\"" Apr 24 21:28:52.254197 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.254171 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d7c9954dd-2z9dc"] Apr 24 21:28:52.300064 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:52.299999 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4eb4b4f_7919_4494_8c58_56a18513db6a.slice/crio-701ccb5dacc6c6595530ce71769487484a124096a5ebc0bf54265690104d8e92 WatchSource:0}: Error finding container 701ccb5dacc6c6595530ce71769487484a124096a5ebc0bf54265690104d8e92: Status 404 returned error can't find the container with id 701ccb5dacc6c6595530ce71769487484a124096a5ebc0bf54265690104d8e92 Apr 24 21:28:52.348482 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.348442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/701fdaca-2cf1-4a21-bb63-c14ca1774495-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w8ld8\" (UID: \"701fdaca-2cf1-4a21-bb63-c14ca1774495\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" Apr 24 21:28:52.449856 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.449822 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/701fdaca-2cf1-4a21-bb63-c14ca1774495-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w8ld8\" (UID: \"701fdaca-2cf1-4a21-bb63-c14ca1774495\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" Apr 24 21:28:52.449954 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:52.449931 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 24 21:28:52.449998 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:28:52.449986 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/701fdaca-2cf1-4a21-bb63-c14ca1774495-monitoring-plugin-cert podName:701fdaca-2cf1-4a21-bb63-c14ca1774495 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:52.949971481 +0000 UTC m=+57.553040247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/701fdaca-2cf1-4a21-bb63-c14ca1774495-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-w8ld8" (UID: "701fdaca-2cf1-4a21-bb63-c14ca1774495") : secret "monitoring-plugin-cert" not found Apr 24 21:28:52.955586 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.955497 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/701fdaca-2cf1-4a21-bb63-c14ca1774495-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w8ld8\" (UID: \"701fdaca-2cf1-4a21-bb63-c14ca1774495\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" Apr 24 21:28:52.958584 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:52.958558 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/701fdaca-2cf1-4a21-bb63-c14ca1774495-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w8ld8\" (UID: \"701fdaca-2cf1-4a21-bb63-c14ca1774495\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" Apr 24 21:28:53.157219 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.157184 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" Apr 24 21:28:53.165766 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.165650 2568 generic.go:358] "Generic (PLEG): container finished" podID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerID="547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c" exitCode=0 Apr 24 21:28:53.165766 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.165745 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerDied","Data":"547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c"} Apr 24 21:28:53.166984 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.166958 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" event={"ID":"a4eb4b4f-7919-4494-8c58-56a18513db6a","Type":"ContainerStarted","Data":"701ccb5dacc6c6595530ce71769487484a124096a5ebc0bf54265690104d8e92"} Apr 24 21:28:53.169746 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.169599 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c" event={"ID":"3fce3a25-2bba-432f-984f-bedaa6e050c1","Type":"ContainerStarted","Data":"01a07a1f9d5c879b7c2a18959139ee9cd17e83150f9c78649fb3bfae7f46d824"} Apr 24 21:28:53.169746 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.169627 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c" event={"ID":"3fce3a25-2bba-432f-984f-bedaa6e050c1","Type":"ContainerStarted","Data":"d453790c2b7c0380f5388418016476ad2745abec460c08eedbd315a52ba421e0"} Apr 24 21:28:53.169746 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:28:53.169640 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c" event={"ID":"3fce3a25-2bba-432f-984f-bedaa6e050c1","Type":"ContainerStarted","Data":"24ec2497e54f9ef290f10ceb21764255ad4873667712a844146d10c18e92870f"} Apr 24 21:28:53.702993 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.702967 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:28:53.708250 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.706847 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:28:53.709262 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.709236 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:28:53.709366 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.709268 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:28:53.710103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.709784 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:28:53.710103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.709902 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:28:53.710103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.709948 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:28:53.710103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.710001 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3omq6bovccuc0\"" Apr 24 
21:28:53.710103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.709948 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 21:28:53.710103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.710051 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 21:28:53.710574 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.710500 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-6p4cm\""
Apr 24 21:28:53.710921 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.710901 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 21:28:53.711041 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.711016 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 21:28:53.711155 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.711106 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 21:28:53.711361 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.711346 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 21:28:53.714791 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.714771 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 21:28:53.716025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.715997 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-897c595bd-f7cd2"]
Apr 24 21:28:53.722005 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.721984 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:28:53.764700 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.764667 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.764808 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.764767 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.764808 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.764799 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.764914 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.764831 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.764914 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.764869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765014 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.764958 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765059 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cq98\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-kube-api-access-4cq98\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765093 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-web-config\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765197 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765128 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765197 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765159 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765197 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765189 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config-out\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765333 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765217 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765333 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765333 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765306 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765470 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765338 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765470 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765377 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.765470 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.765428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.861095 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.861065 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8"]
Apr 24 21:28:53.865049 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:53.865026 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701fdaca_2cf1_4a21_bb63_c14ca1774495.slice/crio-243ec32f031d20c803a0521053edc19cfe958365475f6e82a9ab64f587b6f8c8 WatchSource:0}: Error finding container 243ec32f031d20c803a0521053edc19cfe958365475f6e82a9ab64f587b6f8c8: Status 404 returned error can't find the container with id 243ec32f031d20c803a0521053edc19cfe958365475f6e82a9ab64f587b6f8c8
Apr 24 21:28:53.866096 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866069 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866190 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866115 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866190 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866190 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866171 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866222 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866252 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866293 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cq98\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-kube-api-access-4cq98\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-web-config\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866530 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866347 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866530 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866372 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866530 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config-out\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866530 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866530 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866530 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866498 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.866530 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866524 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.867042 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866553 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.867042 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866595 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.867042 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.867042 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.866816 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.867394 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.867368 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.871613 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.868309 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.871613 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.868528 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.871613 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.869253 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.871613 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.869685 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.871613 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.871475 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.873922 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.873899 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.874408 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.874273 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.874408 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.874357 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.874408 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.874396 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.874620 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.874495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-web-config\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.874758 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.874734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config-out\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.874867 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.874846 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.875014 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.874991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.875121 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.875093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.876543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.876520 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:53.878314 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:53.878243 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cq98\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-kube-api-access-4cq98\") pod \"prometheus-k8s-0\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:54.027149 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.027115 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:28:54.084778 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.084248 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c6xp2"
Apr 24 21:28:54.175463 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.175415 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" event={"ID":"701fdaca-2cf1-4a21-bb63-c14ca1774495","Type":"ContainerStarted","Data":"243ec32f031d20c803a0521053edc19cfe958365475f6e82a9ab64f587b6f8c8"}
Apr 24 21:28:54.177364 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.177327 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" event={"ID":"a4eb4b4f-7919-4494-8c58-56a18513db6a","Type":"ContainerStarted","Data":"a4a03c9b03f3f0e452982b7a2fd36326f21868f2888c937bde0227dc2b5d506f"}
Apr 24 21:28:54.181106 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.181079 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c" event={"ID":"3fce3a25-2bba-432f-984f-bedaa6e050c1","Type":"ContainerStarted","Data":"024f57f3089b8272dfe28fb52f3707a760067c98d03b40a286ba8154f1677409"}
Apr 24 21:28:54.181222 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.181111 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c" event={"ID":"3fce3a25-2bba-432f-984f-bedaa6e050c1","Type":"ContainerStarted","Data":"b24e247ce11e38e5fe0b68faf29ae1be89ee229cf164a72d82915a3e1ee5c163"}
Apr 24 21:28:54.181222 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.181126 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c" event={"ID":"3fce3a25-2bba-432f-984f-bedaa6e050c1","Type":"ContainerStarted","Data":"26220d7f068af54b2c71609e5bdad500625facb21b25bb200ff33c40b32e2b99"}
Apr 24 21:28:54.181358 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.181335 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:28:54.211069 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.211015 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" podStartSLOduration=1.808883542 podStartE2EDuration="3.210996932s" podCreationTimestamp="2026-04-24 21:28:51 +0000 UTC" firstStartedPulling="2026-04-24 21:28:52.307641459 +0000 UTC m=+56.910710234" lastFinishedPulling="2026-04-24 21:28:53.70975485 +0000 UTC m=+58.312823624" observedRunningTime="2026-04-24 21:28:54.210483829 +0000 UTC m=+58.813552609" watchObservedRunningTime="2026-04-24 21:28:54.210996932 +0000 UTC m=+58.814065714"
Apr 24 21:28:54.221084 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.221060 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:28:54.224194 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:28:54.224172 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod917dee55_1b79_41dc_9ff7_27c7e3d3f922.slice/crio-3b7914c1a4f6c9d788ea18193a96ab2cd686d17446e2b05c20fadfe098c354f4 WatchSource:0}: Error finding container 3b7914c1a4f6c9d788ea18193a96ab2cd686d17446e2b05c20fadfe098c354f4: Status 404 returned error can't find the container with id 3b7914c1a4f6c9d788ea18193a96ab2cd686d17446e2b05c20fadfe098c354f4
Apr 24 21:28:54.231316 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.231281 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-897c595bd-f7cd2"
Apr 24 21:28:54.244342 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:54.244304 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c" podStartSLOduration=1.59186375 podStartE2EDuration="5.244291299s" podCreationTimestamp="2026-04-24 21:28:49 +0000 UTC" firstStartedPulling="2026-04-24 21:28:50.053398247 +0000 UTC m=+54.656467013" lastFinishedPulling="2026-04-24 21:28:53.705825788 +0000 UTC m=+58.308894562" observedRunningTime="2026-04-24 21:28:54.24342554 +0000 UTC m=+58.846494359" watchObservedRunningTime="2026-04-24 21:28:54.244291299 +0000 UTC m=+58.847360079"
Apr 24 21:28:55.185392 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:55.185316 2568 generic.go:358] "Generic (PLEG): container finished" podID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerID="cc6bb060f3f1774eda6376a8bbd0011fe34d8aea2c08ff97704d4720339b34c0" exitCode=0
Apr 24 21:28:55.185847 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:55.185406 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerDied","Data":"cc6bb060f3f1774eda6376a8bbd0011fe34d8aea2c08ff97704d4720339b34c0"}
Apr 24 21:28:55.185847 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:55.185444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerStarted","Data":"3b7914c1a4f6c9d788ea18193a96ab2cd686d17446e2b05c20fadfe098c354f4"}
Apr 24 21:28:55.187184 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:55.187160 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerStarted","Data":"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637"}
Apr 24 21:28:56.192363 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.192322 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" event={"ID":"701fdaca-2cf1-4a21-bb63-c14ca1774495","Type":"ContainerStarted","Data":"86870554064b75667f8e4c8ab17a290bbc8814ca4cd3ddd49e48bd403dff9bb7"}
Apr 24 21:28:56.192901 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.192512 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8"
Apr 24 21:28:56.195658 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.195633 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerStarted","Data":"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d"}
Apr 24 21:28:56.195813 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.195661 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerStarted","Data":"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf"}
Apr 24 21:28:56.195813 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.195675 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerStarted","Data":"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa"}
Apr 24 21:28:56.195813 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.195688 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerStarted","Data":"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47"}
Apr 24 21:28:56.195813 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.195699 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerStarted","Data":"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244"}
Apr 24 21:28:56.198825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.198791 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8"
Apr 24 21:28:56.212843 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.212798 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8ld8" podStartSLOduration=2.825309487 podStartE2EDuration="4.212781725s" podCreationTimestamp="2026-04-24 21:28:52 +0000 UTC" firstStartedPulling="2026-04-24 21:28:53.867194889 +0000 UTC m=+58.470263664" lastFinishedPulling="2026-04-24 21:28:55.254667126 +0000 UTC m=+59.857735902" observedRunningTime="2026-04-24 21:28:56.211252632 +0000 UTC m=+60.814321416" watchObservedRunningTime="2026-04-24 21:28:56.212781725 +0000 UTC m=+60.815850502"
Apr 24 21:28:56.249125 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:56.249080 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.405348642 podStartE2EDuration="8.249067014s" podCreationTimestamp="2026-04-24 21:28:48 +0000 UTC" firstStartedPulling="2026-04-24 21:28:49.934573754 +0000 UTC m=+54.537642543" lastFinishedPulling="2026-04-24 21:28:54.778292142 +0000 UTC m=+59.381360915" observedRunningTime="2026-04-24 21:28:56.247484381 +0000 UTC m=+60.850553165" watchObservedRunningTime="2026-04-24 21:28:56.249067014 +0000 UTC m=+60.852135796"
Apr 24 21:28:58.141537 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:58.141510 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rbrfl"
Apr 24 21:28:58.204658 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:58.204626 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerStarted","Data":"9ff1c74ed683f6d13dd49628162ec3104098ecd490bbbcb105b30414d8956a70"}
Apr 24 21:28:58.204658 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:58.204661 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerStarted","Data":"2e004e8c489dec782f7d0667d5618d66113d9a8830374a51b209d8e2898f1f25"}
Apr 24 21:28:58.204658 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:58.204671 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerStarted","Data":"8b61b11bf960ee0f16f5d579fb66806a0d68d1f51421a556c2a7e4db9cab5f10"}
Apr 24 21:28:58.204974 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:58.204683 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerStarted","Data":"86149dc678aa8ae3bb31a4ad289b6f21d9dac0e46e42ae17d1f0243c7d501ba0"}
Apr 24 21:28:58.204974 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:58.204694 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerStarted","Data":"2aaf82e4fc5197238e782988c7365b08b33b15eff95a59b0c8484b2cc4292dac"}
Apr 24 21:28:58.204974 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:58.204727 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerStarted","Data":"eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c"}
Apr 24 21:28:58.233237 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:58.233185 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.669815672 podStartE2EDuration="5.233167135s" podCreationTimestamp="2026-04-24 21:28:53 +0000 UTC" firstStartedPulling="2026-04-24 21:28:55.186661614 +0000 UTC m=+59.789730376" lastFinishedPulling="2026-04-24 21:28:57.750013079 +0000 UTC m=+62.353081839" observedRunningTime="2026-04-24 21:28:58.232812665 +0000 UTC m=+62.835881486" watchObservedRunningTime="2026-04-24 21:28:58.233167135 +0000 UTC m=+62.836235930"
Apr 24 21:28:59.027551 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:28:59.027514 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:00.194382 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:00.194351 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-64fd45769b-zll9c"
Apr 24 21:29:01.024186 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.024159 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:29:01.024186 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.024191 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:29:01.028865 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.028845 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:29:01.217633 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.217601 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c5ddffb54-ltk5x"
Apr 24 21:29:01.744791 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.744759 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:29:01.747262 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.747246 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:29:01.757217 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.757191 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ca2ae96-23c0-4771-ba4d-46f95e147eb7-metrics-certs\") pod \"network-metrics-daemon-xrlcl\" (UID: \"7ca2ae96-23c0-4771-ba4d-46f95e147eb7\") " pod="openshift-multus/network-metrics-daemon-xrlcl"
Apr 24 21:29:01.845993 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.845961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod \"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn"
Apr 24 21:29:01.848915 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.848897 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:29:01.859458 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.859443 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:29:01.869725 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:01.869682 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9mkd\" (UniqueName: \"kubernetes.io/projected/fcefb7d4-b431-40f5-a0ae-d52f3d85cf97-kube-api-access-j9mkd\") pod
\"network-check-target-js4dn\" (UID: \"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97\") " pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:29:02.032375 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:02.032305 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7528m\"" Apr 24 21:29:02.037574 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:02.037558 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-q7cg2\"" Apr 24 21:29:02.040352 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:02.040339 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xrlcl" Apr 24 21:29:02.046046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:02.046024 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:29:02.171785 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:02.171750 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xrlcl"] Apr 24 21:29:02.176718 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:29:02.176678 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ca2ae96_23c0_4771_ba4d_46f95e147eb7.slice/crio-8f585af71fe972ad5722eefce92d72d3f742cc0d9d8bbb734bbd4d2b39f41c9c WatchSource:0}: Error finding container 8f585af71fe972ad5722eefce92d72d3f742cc0d9d8bbb734bbd4d2b39f41c9c: Status 404 returned error can't find the container with id 8f585af71fe972ad5722eefce92d72d3f742cc0d9d8bbb734bbd4d2b39f41c9c Apr 24 21:29:02.188907 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:02.188882 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-js4dn"] Apr 24 21:29:02.191041 
ip-10-0-134-232 kubenswrapper[2568]: W0424 21:29:02.191016 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcefb7d4_b431_40f5_a0ae_d52f3d85cf97.slice/crio-c4c9c234251fface5342fa8ab66c2ece893f7746146e0271161f868344a7c858 WatchSource:0}: Error finding container c4c9c234251fface5342fa8ab66c2ece893f7746146e0271161f868344a7c858: Status 404 returned error can't find the container with id c4c9c234251fface5342fa8ab66c2ece893f7746146e0271161f868344a7c858 Apr 24 21:29:02.218191 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:02.218162 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xrlcl" event={"ID":"7ca2ae96-23c0-4771-ba4d-46f95e147eb7","Type":"ContainerStarted","Data":"8f585af71fe972ad5722eefce92d72d3f742cc0d9d8bbb734bbd4d2b39f41c9c"} Apr 24 21:29:02.219159 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:02.219132 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-js4dn" event={"ID":"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97","Type":"ContainerStarted","Data":"c4c9c234251fface5342fa8ab66c2ece893f7746146e0271161f868344a7c858"} Apr 24 21:29:04.227614 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:04.227566 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xrlcl" event={"ID":"7ca2ae96-23c0-4771-ba4d-46f95e147eb7","Type":"ContainerStarted","Data":"87924ad1aff51cdc0a9019dd4dab40f22274eebd179305ed9b48ce6363aa62f5"} Apr 24 21:29:04.227614 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:04.227615 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xrlcl" event={"ID":"7ca2ae96-23c0-4771-ba4d-46f95e147eb7","Type":"ContainerStarted","Data":"04065c3fdb28b63f63da73a2c60728f850d215603c70aef4cd3aad50b2d434ef"} Apr 24 21:29:04.247864 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:04.247812 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xrlcl" podStartSLOduration=67.173383134 podStartE2EDuration="1m8.247792631s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:29:02.178574549 +0000 UTC m=+66.781643308" lastFinishedPulling="2026-04-24 21:29:03.252984033 +0000 UTC m=+67.856052805" observedRunningTime="2026-04-24 21:29:04.246317554 +0000 UTC m=+68.849386336" watchObservedRunningTime="2026-04-24 21:29:04.247792631 +0000 UTC m=+68.850861464" Apr 24 21:29:04.859724 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:04.859597 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c5ddffb54-ltk5x"] Apr 24 21:29:05.231668 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:05.231576 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-js4dn" event={"ID":"fcefb7d4-b431-40f5-a0ae-d52f3d85cf97","Type":"ContainerStarted","Data":"8ce8d648b481dff59461515594851e5b95cadc2e0aa45210b3e5ce12a06920ed"} Apr 24 21:29:05.231668 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:05.231637 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:29:05.254554 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:05.254505 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-js4dn" podStartSLOduration=66.444945201 podStartE2EDuration="1m9.254491438s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:29:02.193051433 +0000 UTC m=+66.796120192" lastFinishedPulling="2026-04-24 21:29:05.002597657 +0000 UTC m=+69.605666429" observedRunningTime="2026-04-24 21:29:05.252619753 +0000 UTC m=+69.855688535" watchObservedRunningTime="2026-04-24 21:29:05.254491438 +0000 UTC m=+69.857560286" Apr 24 
21:29:12.129978 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:12.129945 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:29:12.130408 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:12.129988 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:29:18.742379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:18.742245 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-897c595bd-f7cd2" podUID="516e7cc4-5f56-453f-be11-80d450f1323f" containerName="console" containerID="cri-o://880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60" gracePeriod=15 Apr 24 21:29:19.039150 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.039128 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-897c595bd-f7cd2_516e7cc4-5f56-453f-be11-80d450f1323f/console/0.log" Apr 24 21:29:19.039249 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.039214 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:29:19.103685 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.103659 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-oauth-serving-cert\") pod \"516e7cc4-5f56-453f-be11-80d450f1323f\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " Apr 24 21:29:19.103825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.103737 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mlv9\" (UniqueName: \"kubernetes.io/projected/516e7cc4-5f56-453f-be11-80d450f1323f-kube-api-access-2mlv9\") pod \"516e7cc4-5f56-453f-be11-80d450f1323f\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " Apr 24 21:29:19.103825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.103767 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-serving-cert\") pod \"516e7cc4-5f56-453f-be11-80d450f1323f\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " Apr 24 21:29:19.103825 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.103813 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-service-ca\") pod \"516e7cc4-5f56-453f-be11-80d450f1323f\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " Apr 24 21:29:19.103960 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.103851 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-oauth-config\") pod \"516e7cc4-5f56-453f-be11-80d450f1323f\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " Apr 24 21:29:19.103960 
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.103880 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-console-config\") pod \"516e7cc4-5f56-453f-be11-80d450f1323f\" (UID: \"516e7cc4-5f56-453f-be11-80d450f1323f\") " Apr 24 21:29:19.104076 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.104042 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "516e7cc4-5f56-453f-be11-80d450f1323f" (UID: "516e7cc4-5f56-453f-be11-80d450f1323f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:19.104274 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.104251 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-service-ca" (OuterVolumeSpecName: "service-ca") pod "516e7cc4-5f56-453f-be11-80d450f1323f" (UID: "516e7cc4-5f56-453f-be11-80d450f1323f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:19.104335 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.104311 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-console-config" (OuterVolumeSpecName: "console-config") pod "516e7cc4-5f56-453f-be11-80d450f1323f" (UID: "516e7cc4-5f56-453f-be11-80d450f1323f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:19.105996 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.105975 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/516e7cc4-5f56-453f-be11-80d450f1323f-kube-api-access-2mlv9" (OuterVolumeSpecName: "kube-api-access-2mlv9") pod "516e7cc4-5f56-453f-be11-80d450f1323f" (UID: "516e7cc4-5f56-453f-be11-80d450f1323f"). InnerVolumeSpecName "kube-api-access-2mlv9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:19.106260 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.106241 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "516e7cc4-5f56-453f-be11-80d450f1323f" (UID: "516e7cc4-5f56-453f-be11-80d450f1323f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:19.106313 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.106268 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "516e7cc4-5f56-453f-be11-80d450f1323f" (UID: "516e7cc4-5f56-453f-be11-80d450f1323f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:19.205298 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.205265 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2mlv9\" (UniqueName: \"kubernetes.io/projected/516e7cc4-5f56-453f-be11-80d450f1323f-kube-api-access-2mlv9\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:19.205298 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.205292 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-serving-cert\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:19.205298 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.205303 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-service-ca\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:19.205500 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.205312 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/516e7cc4-5f56-453f-be11-80d450f1323f-console-oauth-config\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:19.205500 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.205323 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-console-config\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:19.205500 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.205333 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/516e7cc4-5f56-453f-be11-80d450f1323f-oauth-serving-cert\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:19.273675 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:29:19.273646 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-897c595bd-f7cd2_516e7cc4-5f56-453f-be11-80d450f1323f/console/0.log" Apr 24 21:29:19.273846 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.273689 2568 generic.go:358] "Generic (PLEG): container finished" podID="516e7cc4-5f56-453f-be11-80d450f1323f" containerID="880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60" exitCode=2 Apr 24 21:29:19.273846 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.273759 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-897c595bd-f7cd2" event={"ID":"516e7cc4-5f56-453f-be11-80d450f1323f","Type":"ContainerDied","Data":"880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60"} Apr 24 21:29:19.273846 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.273789 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-897c595bd-f7cd2" Apr 24 21:29:19.273846 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.273806 2568 scope.go:117] "RemoveContainer" containerID="880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60" Apr 24 21:29:19.274035 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.273794 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-897c595bd-f7cd2" event={"ID":"516e7cc4-5f56-453f-be11-80d450f1323f","Type":"ContainerDied","Data":"c2cd8810c42170802ce6ce484a25a6b31c7b41388457376f609869d712879894"} Apr 24 21:29:19.282593 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.282568 2568 scope.go:117] "RemoveContainer" containerID="880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60" Apr 24 21:29:19.282883 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:19.282863 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60\": container with ID starting with 880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60 not found: ID does not exist" containerID="880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60" Apr 24 21:29:19.282929 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.282891 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60"} err="failed to get container status \"880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60\": rpc error: code = NotFound desc = could not find container \"880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60\": container with ID starting with 880d34fdd985413fdf5ffddc0b0be22f8c63c8e48ff725d7a0394e0b53286c60 not found: ID does not exist" Apr 24 21:29:19.308955 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.308929 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-897c595bd-f7cd2"] Apr 24 21:29:19.316314 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.316296 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-897c595bd-f7cd2"] Apr 24 21:29:19.920464 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:19.917756 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="516e7cc4-5f56-453f-be11-80d450f1323f" path="/var/lib/kubelet/pods/516e7cc4-5f56-453f-be11-80d450f1323f/volumes" Apr 24 21:29:29.881738 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:29.881679 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c5ddffb54-ltk5x" podUID="a3f7371e-f0da-45c4-9ec2-1206b6bc9034" containerName="console" containerID="cri-o://f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c" gracePeriod=15 Apr 24 21:29:30.119664 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.119642 2568 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c5ddffb54-ltk5x_a3f7371e-f0da-45c4-9ec2-1206b6bc9034/console/0.log" Apr 24 21:29:30.119785 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.119723 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c5ddffb54-ltk5x" Apr 24 21:29:30.198564 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.198484 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-oauth-config\") pod \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " Apr 24 21:29:30.198564 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.198539 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-config\") pod \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " Apr 24 21:29:30.198797 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.198575 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-trusted-ca-bundle\") pod \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " Apr 24 21:29:30.198797 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.198618 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-service-ca\") pod \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " Apr 24 21:29:30.198797 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.198639 2568 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-serving-cert\") pod \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " Apr 24 21:29:30.198797 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.198677 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-oauth-serving-cert\") pod \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " Apr 24 21:29:30.198797 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.198700 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt6sh\" (UniqueName: \"kubernetes.io/projected/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-kube-api-access-zt6sh\") pod \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\" (UID: \"a3f7371e-f0da-45c4-9ec2-1206b6bc9034\") " Apr 24 21:29:30.199123 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.199092 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-service-ca" (OuterVolumeSpecName: "service-ca") pod "a3f7371e-f0da-45c4-9ec2-1206b6bc9034" (UID: "a3f7371e-f0da-45c4-9ec2-1206b6bc9034"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:30.199123 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.199109 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a3f7371e-f0da-45c4-9ec2-1206b6bc9034" (UID: "a3f7371e-f0da-45c4-9ec2-1206b6bc9034"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:30.199258 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.199097 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-config" (OuterVolumeSpecName: "console-config") pod "a3f7371e-f0da-45c4-9ec2-1206b6bc9034" (UID: "a3f7371e-f0da-45c4-9ec2-1206b6bc9034"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:30.199258 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.199144 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a3f7371e-f0da-45c4-9ec2-1206b6bc9034" (UID: "a3f7371e-f0da-45c4-9ec2-1206b6bc9034"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:30.200631 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.200611 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a3f7371e-f0da-45c4-9ec2-1206b6bc9034" (UID: "a3f7371e-f0da-45c4-9ec2-1206b6bc9034"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:30.200878 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.200850 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a3f7371e-f0da-45c4-9ec2-1206b6bc9034" (UID: "a3f7371e-f0da-45c4-9ec2-1206b6bc9034"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:30.201068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.201047 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-kube-api-access-zt6sh" (OuterVolumeSpecName: "kube-api-access-zt6sh") pod "a3f7371e-f0da-45c4-9ec2-1206b6bc9034" (UID: "a3f7371e-f0da-45c4-9ec2-1206b6bc9034"). InnerVolumeSpecName "kube-api-access-zt6sh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:30.299506 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.299474 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-oauth-config\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:30.299506 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.299501 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-config\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:30.299506 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.299510 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-trusted-ca-bundle\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:30.299741 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.299519 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-service-ca\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:30.299741 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.299528 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-console-serving-cert\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:30.299741 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.299537 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-oauth-serving-cert\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:30.299741 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.299545 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zt6sh\" (UniqueName: \"kubernetes.io/projected/a3f7371e-f0da-45c4-9ec2-1206b6bc9034-kube-api-access-zt6sh\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:30.307791 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.307770 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c5ddffb54-ltk5x_a3f7371e-f0da-45c4-9ec2-1206b6bc9034/console/0.log" Apr 24 21:29:30.307915 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.307812 2568 generic.go:358] "Generic (PLEG): container finished" podID="a3f7371e-f0da-45c4-9ec2-1206b6bc9034" containerID="f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c" exitCode=2 Apr 24 21:29:30.307915 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.307879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c5ddffb54-ltk5x" event={"ID":"a3f7371e-f0da-45c4-9ec2-1206b6bc9034","Type":"ContainerDied","Data":"f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c"} Apr 24 21:29:30.307915 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.307896 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c5ddffb54-ltk5x" Apr 24 21:29:30.307915 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.307911 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c5ddffb54-ltk5x" event={"ID":"a3f7371e-f0da-45c4-9ec2-1206b6bc9034","Type":"ContainerDied","Data":"d920e6e61a177cbb00b8bee2842be0ddf6bc76e62bfad08d5f59e6dd078c71cd"} Apr 24 21:29:30.308095 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.307935 2568 scope.go:117] "RemoveContainer" containerID="f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c" Apr 24 21:29:30.316315 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.316294 2568 scope.go:117] "RemoveContainer" containerID="f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c" Apr 24 21:29:30.316562 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:30.316543 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c\": container with ID starting with f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c not found: ID does not exist" containerID="f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c" Apr 24 21:29:30.316612 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.316570 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c"} err="failed to get container status \"f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c\": rpc error: code = NotFound desc = could not find container \"f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c\": container with ID starting with f69b09e143928a236bb37fd811ed49fe20c34bb13ffbc643afd2f0b53928a66c not found: ID does not exist" Apr 24 21:29:30.332269 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.332249 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c5ddffb54-ltk5x"] Apr 24 21:29:30.337013 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:30.336993 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c5ddffb54-ltk5x"] Apr 24 21:29:31.916570 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:31.916540 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f7371e-f0da-45c4-9ec2-1206b6bc9034" path="/var/lib/kubelet/pods/a3f7371e-f0da-45c4-9ec2-1206b6bc9034/volumes" Apr 24 21:29:32.134576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:32.134551 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:29:32.138301 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:32.138280 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7d7c9954dd-2z9dc" Apr 24 21:29:36.236808 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:36.236776 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-js4dn" Apr 24 21:29:52.315334 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:52.315303 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:52.334271 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:52.334247 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:52.390118 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:52.390091 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:57.917841 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:57.917810 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 
21:29:57.918988 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:57.918930 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="alertmanager" containerID="cri-o://2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637" gracePeriod=120 Apr 24 21:29:57.919206 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:57.918987 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy-metric" containerID="cri-o://370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf" gracePeriod=120 Apr 24 21:29:57.919519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:57.919012 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy-web" containerID="cri-o://ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47" gracePeriod=120 Apr 24 21:29:57.919519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:57.919021 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy" containerID="cri-o://13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa" gracePeriod=120 Apr 24 21:29:57.919519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:57.919062 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="config-reloader" containerID="cri-o://3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244" gracePeriod=120 Apr 24 21:29:57.919519 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:29:57.919071 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="prom-label-proxy" containerID="cri-o://29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d" gracePeriod=120 Apr 24 21:29:58.397648 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:58.397617 2568 generic.go:358] "Generic (PLEG): container finished" podID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerID="29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d" exitCode=0 Apr 24 21:29:58.397648 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:58.397641 2568 generic.go:358] "Generic (PLEG): container finished" podID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerID="13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa" exitCode=0 Apr 24 21:29:58.397648 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:58.397648 2568 generic.go:358] "Generic (PLEG): container finished" podID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerID="3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244" exitCode=0 Apr 24 21:29:58.397648 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:58.397653 2568 generic.go:358] "Generic (PLEG): container finished" podID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerID="2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637" exitCode=0 Apr 24 21:29:58.398027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:58.397687 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerDied","Data":"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d"} Apr 24 21:29:58.398027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:58.397768 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerDied","Data":"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa"} Apr 24 21:29:58.398027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:58.397780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerDied","Data":"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244"} Apr 24 21:29:58.398027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:58.397789 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerDied","Data":"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637"} Apr 24 21:29:59.187558 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.187531 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.241301 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241273 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.241476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241332 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-out\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.241476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241358 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-web\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.241476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241384 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.241476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241423 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.241476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241463 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-metrics-client-ca\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.241747 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241487 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-tls-assets\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.241912 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241847 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:59.241912 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241904 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-main-db\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.242076 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.241957 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-volume\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.242133 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.242088 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-cluster-tls-config\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.242186 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.242131 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.242186 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.242174 2568 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhb5f\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-kube-api-access-lhb5f\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.242294 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.242202 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-web-config\") pod \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\" (UID: \"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc\") " Apr 24 21:29:59.242347 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.242288 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:29:59.242478 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.242452 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-main-db\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.242478 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.242484 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.242855 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.242828 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:59.244995 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.244951 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:59.245267 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.245233 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:59.245456 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.245433 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:59.246464 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.246435 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:59.246548 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.246512 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-out" (OuterVolumeSpecName: "config-out") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:29:59.246908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.246875 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:59.247640 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.247576 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-kube-api-access-lhb5f" (OuterVolumeSpecName: "kube-api-access-lhb5f") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "kube-api-access-lhb5f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:59.247640 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.247614 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-volume" (OuterVolumeSpecName: "config-volume") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:59.251444 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.251303 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:59.260760 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.258768 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-web-config" (OuterVolumeSpecName: "web-config") pod "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" (UID: "32fdce74-05d6-4fd9-a65e-0a9f20c44cdc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:59.343617 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343583 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-metrics-client-ca\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343617 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343614 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-tls-assets\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343617 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343623 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-volume\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343632 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-cluster-tls-config\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343642 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343651 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lhb5f\" (UniqueName: \"kubernetes.io/projected/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-kube-api-access-lhb5f\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343660 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-web-config\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343668 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-main-tls\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343678 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-config-out\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343687 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.343874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.343697 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:29:59.405402 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.405362 2568 generic.go:358] "Generic (PLEG): container finished" podID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerID="370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf" exitCode=0 Apr 24 21:29:59.413732 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.405916 2568 generic.go:358] "Generic (PLEG): container finished" podID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerID="ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47" exitCode=0 Apr 24 21:29:59.413732 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.405768 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.413732 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.405791 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerDied","Data":"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf"} Apr 24 21:29:59.414004 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.413753 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerDied","Data":"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47"} Apr 24 21:29:59.414004 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.413775 2568 scope.go:117] "RemoveContainer" containerID="29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d" Apr 24 21:29:59.414004 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.413778 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"32fdce74-05d6-4fd9-a65e-0a9f20c44cdc","Type":"ContainerDied","Data":"8d31837c9f91b6e58ae85aa8b1ad6576e5f55c6fe55419bc7f33852441f28371"} Apr 24 21:29:59.421499 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.421480 2568 scope.go:117] "RemoveContainer" containerID="370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf" Apr 24 21:29:59.434467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.434428 2568 scope.go:117] "RemoveContainer" containerID="13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa" Apr 24 21:29:59.440742 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.440695 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:59.444022 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.443967 2568 scope.go:117] "RemoveContainer" containerID="ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47" Apr 24 21:29:59.446314 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.446291 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:59.451888 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.451868 2568 scope.go:117] "RemoveContainer" containerID="3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244" Apr 24 21:29:59.459440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.459365 2568 scope.go:117] "RemoveContainer" containerID="2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637" Apr 24 21:29:59.479050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.478959 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:59.479050 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.478994 2568 scope.go:117] "RemoveContainer" containerID="547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c" Apr 24 21:29:59.479379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479362 2568 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy-metric" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479381 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy-metric" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479394 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="alertmanager" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479403 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="alertmanager" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479413 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479421 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479441 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3f7371e-f0da-45c4-9ec2-1206b6bc9034" containerName="console" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479449 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f7371e-f0da-45c4-9ec2-1206b6bc9034" containerName="console" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479457 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="prom-label-proxy" Apr 24 21:29:59.479467 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:29:59.479465 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="prom-label-proxy" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479481 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="config-reloader" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479489 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="config-reloader" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479505 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy-web" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479514 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy-web" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479524 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="init-config-reloader" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479532 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="init-config-reloader" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479544 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="516e7cc4-5f56-453f-be11-80d450f1323f" containerName="console" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479553 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="516e7cc4-5f56-453f-be11-80d450f1323f" containerName="console" Apr 24 21:29:59.479908 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:29:59.479624 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy-metric" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479637 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="config-reloader" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479648 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="prom-label-proxy" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479660 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="alertmanager" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479669 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy-web" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479681 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3f7371e-f0da-45c4-9ec2-1206b6bc9034" containerName="console" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479691 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" containerName="kube-rbac-proxy" Apr 24 21:29:59.479908 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.479701 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="516e7cc4-5f56-453f-be11-80d450f1323f" containerName="console" Apr 24 21:29:59.482851 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.482818 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.485227 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.485202 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 21:29:59.485326 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.485237 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 21:29:59.485907 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.485795 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 21:29:59.485907 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.485806 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 21:29:59.486104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.486087 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 21:29:59.486233 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.486220 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 21:29:59.486304 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.486287 2568 scope.go:117] "RemoveContainer" containerID="29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d" Apr 24 21:29:59.486358 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.486300 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4kqqb\"" Apr 24 21:29:59.486563 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.486526 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 21:29:59.486647 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.486578 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 21:29:59.486894 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:59.486687 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d\": container with ID starting with 29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d not found: ID does not exist" containerID="29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d" Apr 24 21:29:59.486894 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.486745 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d"} err="failed to get container status \"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d\": rpc error: code = NotFound desc = could not find container \"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d\": container with ID starting with 29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d not found: ID does not exist" Apr 24 21:29:59.486894 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.486776 2568 scope.go:117] "RemoveContainer" containerID="370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf" Apr 24 21:29:59.487094 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:59.487076 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf\": container with ID starting with 370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf not found: ID does not 
exist" containerID="370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf" Apr 24 21:29:59.487141 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.487108 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf"} err="failed to get container status \"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf\": rpc error: code = NotFound desc = could not find container \"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf\": container with ID starting with 370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf not found: ID does not exist" Apr 24 21:29:59.487141 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.487130 2568 scope.go:117] "RemoveContainer" containerID="13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa" Apr 24 21:29:59.487452 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:59.487428 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa\": container with ID starting with 13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa not found: ID does not exist" containerID="13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa" Apr 24 21:29:59.487507 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.487460 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa"} err="failed to get container status \"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa\": rpc error: code = NotFound desc = could not find container \"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa\": container with ID starting with 13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa not found: ID does not exist" Apr 24 
21:29:59.487507 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.487481 2568 scope.go:117] "RemoveContainer" containerID="ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47" Apr 24 21:29:59.487759 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:59.487737 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47\": container with ID starting with ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47 not found: ID does not exist" containerID="ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47" Apr 24 21:29:59.487845 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.487766 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47"} err="failed to get container status \"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47\": rpc error: code = NotFound desc = could not find container \"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47\": container with ID starting with ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47 not found: ID does not exist" Apr 24 21:29:59.487845 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.487786 2568 scope.go:117] "RemoveContainer" containerID="3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244" Apr 24 21:29:59.488030 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:59.487981 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244\": container with ID starting with 3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244 not found: ID does not exist" containerID="3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244" Apr 24 21:29:59.488030 
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.488015 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244"} err="failed to get container status \"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244\": rpc error: code = NotFound desc = could not find container \"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244\": container with ID starting with 3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244 not found: ID does not exist" Apr 24 21:29:59.488179 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.488037 2568 scope.go:117] "RemoveContainer" containerID="2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637" Apr 24 21:29:59.488255 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:59.488236 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637\": container with ID starting with 2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637 not found: ID does not exist" containerID="2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637" Apr 24 21:29:59.488302 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.488261 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637"} err="failed to get container status \"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637\": rpc error: code = NotFound desc = could not find container \"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637\": container with ID starting with 2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637 not found: ID does not exist" Apr 24 21:29:59.488302 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.488280 2568 scope.go:117] 
"RemoveContainer" containerID="547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c" Apr 24 21:29:59.488522 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:29:59.488504 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c\": container with ID starting with 547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c not found: ID does not exist" containerID="547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c" Apr 24 21:29:59.488588 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.488529 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c"} err="failed to get container status \"547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c\": rpc error: code = NotFound desc = could not find container \"547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c\": container with ID starting with 547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c not found: ID does not exist" Apr 24 21:29:59.488588 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.488550 2568 scope.go:117] "RemoveContainer" containerID="29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d" Apr 24 21:29:59.488815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.488790 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d"} err="failed to get container status \"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d\": rpc error: code = NotFound desc = could not find container \"29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d\": container with ID starting with 29fc7724954c6f8835cf5745bca429baa1e11829d9d9ecd77571bb4e0386080d not found: ID does not 
exist" Apr 24 21:29:59.488879 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.488817 2568 scope.go:117] "RemoveContainer" containerID="370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf" Apr 24 21:29:59.489108 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.489078 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf"} err="failed to get container status \"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf\": rpc error: code = NotFound desc = could not find container \"370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf\": container with ID starting with 370620e21eabbdf8023fc66953a27eedf2d5e0def6b5c782885ddab694f19bcf not found: ID does not exist" Apr 24 21:29:59.489158 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.489109 2568 scope.go:117] "RemoveContainer" containerID="13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa" Apr 24 21:29:59.489440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.489419 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa"} err="failed to get container status \"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa\": rpc error: code = NotFound desc = could not find container \"13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa\": container with ID starting with 13d27eb123cb60fe55072e1c84005ab45a3506886f71afa12de7f3ca2bf75efa not found: ID does not exist" Apr 24 21:29:59.489518 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.489441 2568 scope.go:117] "RemoveContainer" containerID="ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47" Apr 24 21:29:59.489694 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.489670 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47"} err="failed to get container status \"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47\": rpc error: code = NotFound desc = could not find container \"ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47\": container with ID starting with ab07f97ebbfa2eac49ac126a03feb1aae62c0292febe5b12801b616bb9009a47 not found: ID does not exist" Apr 24 21:29:59.489807 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.489696 2568 scope.go:117] "RemoveContainer" containerID="3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244" Apr 24 21:29:59.490054 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.489995 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244"} err="failed to get container status \"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244\": rpc error: code = NotFound desc = could not find container \"3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244\": container with ID starting with 3dce6a2fb0420e09753060103ffa66801f25cb511c02d9da324db82758e35244 not found: ID does not exist" Apr 24 21:29:59.490054 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.490022 2568 scope.go:117] "RemoveContainer" containerID="2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637" Apr 24 21:29:59.490342 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.490318 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637"} err="failed to get container status \"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637\": rpc error: code = NotFound desc = could not find container \"2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637\": container with ID starting with 
2fb52bf53f82608e7ebe5ca7348b9d199316fb9289c975eaf438993cea103637 not found: ID does not exist" Apr 24 21:29:59.490419 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.490342 2568 scope.go:117] "RemoveContainer" containerID="547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c" Apr 24 21:29:59.490673 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.490649 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c"} err="failed to get container status \"547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c\": rpc error: code = NotFound desc = could not find container \"547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c\": container with ID starting with 547ca49ffded29710a2c5eff37a1d2a678efad13bf169c137ab262aa85db5c6c not found: ID does not exist" Apr 24 21:29:59.494377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.494360 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 21:29:59.505638 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.505589 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:29:59.544747 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.544871 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544757 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.544871 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544780 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1dc2849c-4a7c-4571-a621-370a0e01f551-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.544871 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544800 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.544871 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544829 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1dc2849c-4a7c-4571-a621-370a0e01f551-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.545031 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544882 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdh98\" (UniqueName: \"kubernetes.io/projected/1dc2849c-4a7c-4571-a621-370a0e01f551-kube-api-access-tdh98\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.545031 ip-10-0-134-232 kubenswrapper[2568]: 
I0424 21:29:59.544908 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1dc2849c-4a7c-4571-a621-370a0e01f551-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.545031 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544925 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-config-volume\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.545031 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544963 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.545031 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.544992 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc2849c-4a7c-4571-a621-370a0e01f551-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.545234 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.545049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-web-config\") pod 
\"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.545234 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.545070 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.545234 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.545096 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1dc2849c-4a7c-4571-a621-370a0e01f551-config-out\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.645666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645633 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1dc2849c-4a7c-4571-a621-370a0e01f551-config-out\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.645851 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645675 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.645851 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645695 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.645851 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1dc2849c-4a7c-4571-a621-370a0e01f551-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.645851 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645747 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.645851 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1dc2849c-4a7c-4571-a621-370a0e01f551-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.645851 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645802 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdh98\" (UniqueName: \"kubernetes.io/projected/1dc2849c-4a7c-4571-a621-370a0e01f551-kube-api-access-tdh98\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.645851 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645836 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1dc2849c-4a7c-4571-a621-370a0e01f551-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.646203 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-config-volume\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.646203 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.646203 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645912 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc2849c-4a7c-4571-a621-370a0e01f551-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.646203 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645952 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-web-config\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 24 21:29:59.646203 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.645987 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.647160 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.646514 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1dc2849c-4a7c-4571-a621-370a0e01f551-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.647160 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.646541 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1dc2849c-4a7c-4571-a621-370a0e01f551-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.648796 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.648769 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:29:59.648916 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.648865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.648916 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.648875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1dc2849c-4a7c-4571-a621-370a0e01f551-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.648916 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.648877 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.649072 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.649006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1dc2849c-4a7c-4571-a621-370a0e01f551-config-out\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.649177 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.649154 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.649662 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.649640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1dc2849c-4a7c-4571-a621-370a0e01f551-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.650246 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.650211 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.650514 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.650489 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-config-volume\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.651068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.651049 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1dc2849c-4a7c-4571-a621-370a0e01f551-web-config\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.657737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.657688 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdh98\" (UniqueName: \"kubernetes.io/projected/1dc2849c-4a7c-4571-a621-370a0e01f551-kube-api-access-tdh98\") pod \"alertmanager-main-0\" (UID: \"1dc2849c-4a7c-4571-a621-370a0e01f551\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.794971 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.794871 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:59.919903 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.919867 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32fdce74-05d6-4fd9-a65e-0a9f20c44cdc" path="/var/lib/kubelet/pods/32fdce74-05d6-4fd9-a65e-0a9f20c44cdc/volumes"
Apr 24 21:29:59.965438 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:29:59.959460 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:29:59.970830 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:29:59.970806 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc2849c_4a7c_4571_a621_370a0e01f551.slice/crio-1794bbc47563d07ebbb4f1748f6d43cee9d67c2a8d052c138a34386e452b4ef6 WatchSource:0}: Error finding container 1794bbc47563d07ebbb4f1748f6d43cee9d67c2a8d052c138a34386e452b4ef6: Status 404 returned error can't find the container with id 1794bbc47563d07ebbb4f1748f6d43cee9d67c2a8d052c138a34386e452b4ef6
Apr 24 21:30:00.420028 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:00.419957 2568 generic.go:358] "Generic (PLEG): container finished" podID="1dc2849c-4a7c-4571-a621-370a0e01f551" containerID="9550a4b32bfe0c83d83fd394c4c0fa54f6f5cbb0d8bfa5d45ce7ec60b5011bd6" exitCode=0
Apr 24 21:30:00.420358 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:00.420033 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1dc2849c-4a7c-4571-a621-370a0e01f551","Type":"ContainerDied","Data":"9550a4b32bfe0c83d83fd394c4c0fa54f6f5cbb0d8bfa5d45ce7ec60b5011bd6"}
Apr 24 21:30:00.420358 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:00.420051 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1dc2849c-4a7c-4571-a621-370a0e01f551","Type":"ContainerStarted","Data":"1794bbc47563d07ebbb4f1748f6d43cee9d67c2a8d052c138a34386e452b4ef6"}
Apr 24 21:30:01.425326 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.425290 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1dc2849c-4a7c-4571-a621-370a0e01f551","Type":"ContainerStarted","Data":"76585059124aefd5c94a10b57278e94e12df845a61d17f0565d9f8dd8520a8dc"}
Apr 24 21:30:01.425799 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.425331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1dc2849c-4a7c-4571-a621-370a0e01f551","Type":"ContainerStarted","Data":"e89b950582951202875d558dd762f6d81b0c7f15af0a9dd63341a0f87af7a221"}
Apr 24 21:30:01.425799 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.425344 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1dc2849c-4a7c-4571-a621-370a0e01f551","Type":"ContainerStarted","Data":"320a02d542a1248ef29098a93aeb9c6478a34fdab1502931059e45ccd77524b6"}
Apr 24 21:30:01.425799 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.425353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1dc2849c-4a7c-4571-a621-370a0e01f551","Type":"ContainerStarted","Data":"f98209cfb5baeeed074e42868ca4a80ffd044c81cc3139cfc4289fa090601dae"}
Apr 24 21:30:01.425799 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.425360 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1dc2849c-4a7c-4571-a621-370a0e01f551","Type":"ContainerStarted","Data":"9eef95457a85a22ed2df5d7837c11207622018fe429fa78d793be0256fe129c8"}
Apr 24 21:30:01.425799 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.425370 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1dc2849c-4a7c-4571-a621-370a0e01f551","Type":"ContainerStarted","Data":"99ea33841965580d47b594bbb0233e62a81803bd037dfacd8061035c8ee19979"}
Apr 24 21:30:01.461813 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.461753 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.461733207 podStartE2EDuration="2.461733207s" podCreationTimestamp="2026-04-24 21:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:01.459279585 +0000 UTC m=+126.062348406" watchObservedRunningTime="2026-04-24 21:30:01.461733207 +0000 UTC m=+126.064801990"
Apr 24 21:30:01.971071 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.971038 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"]
Apr 24 21:30:01.973552 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.973535 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:01.975961 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.975933 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 21:30:01.976093 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.975963 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 21:30:01.976156 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.976095 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-8hrpz\""
Apr 24 21:30:01.976156 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.976113 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 21:30:01.976251 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.976214 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 21:30:01.976379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.976366 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 21:30:01.982144 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.982125 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 21:30:01.986309 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:01.986289 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"]
Apr 24 21:30:02.074548 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.074517 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kptg6\" (UniqueName: \"kubernetes.io/projected/ab725086-c0f4-4827-ba09-3bcd0a0470e0-kube-api-access-kptg6\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.074730 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.074562 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-secret-telemeter-client\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.074730 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.074634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-telemeter-client-tls\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.074730 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.074677 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-federate-client-tls\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.074730 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.074726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.074932 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.074825 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-serving-certs-ca-bundle\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.074932 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.074871 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.074932 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.074915 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-metrics-client-ca\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.175650 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.175621 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.175849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.175659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-metrics-client-ca\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.175849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.175698 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kptg6\" (UniqueName: \"kubernetes.io/projected/ab725086-c0f4-4827-ba09-3bcd0a0470e0-kube-api-access-kptg6\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.175849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.175729 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-secret-telemeter-client\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.175849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.175747 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-telemeter-client-tls\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.175849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.175762 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-federate-client-tls\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.175849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.175778 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.175849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.175820 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-serving-certs-ca-bundle\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.176606 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.176579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-metrics-client-ca\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.176606 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.176602 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-serving-certs-ca-bundle\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.176815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.176636 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab725086-c0f4-4827-ba09-3bcd0a0470e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.178373 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.178350 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.178509 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.178488 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-federate-client-tls\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.178644 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.178628 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-telemeter-client-tls\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.178687 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.178630 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ab725086-c0f4-4827-ba09-3bcd0a0470e0-secret-telemeter-client\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.187250 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.187227 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kptg6\" (UniqueName: \"kubernetes.io/projected/ab725086-c0f4-4827-ba09-3bcd0a0470e0-kube-api-access-kptg6\") pod \"telemeter-client-6bbc6c965b-tzqbm\" (UID: \"ab725086-c0f4-4827-ba09-3bcd0a0470e0\") " pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.284080 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.284049 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"
Apr 24 21:30:02.299040 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.299010 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:30:02.299496 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.299463 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="prometheus" containerID="cri-o://eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c" gracePeriod=600
Apr 24 21:30:02.299616 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.299508 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9ff1c74ed683f6d13dd49628162ec3104098ecd490bbbcb105b30414d8956a70" gracePeriod=600
Apr 24 21:30:02.299616 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.299488 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy" containerID="cri-o://2e004e8c489dec782f7d0667d5618d66113d9a8830374a51b209d8e2898f1f25" gracePeriod=600
Apr 24 21:30:02.299616 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.299603 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy-web" containerID="cri-o://8b61b11bf960ee0f16f5d579fb66806a0d68d1f51421a556c2a7e4db9cab5f10" gracePeriod=600
Apr 24 21:30:02.299800 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.299615 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="config-reloader" containerID="cri-o://2aaf82e4fc5197238e782988c7365b08b33b15eff95a59b0c8484b2cc4292dac" gracePeriod=600
Apr 24 21:30:02.299800 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.299603 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="thanos-sidecar" containerID="cri-o://86149dc678aa8ae3bb31a4ad289b6f21d9dac0e46e42ae17d1f0243c7d501ba0" gracePeriod=600
Apr 24 21:30:02.374445 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:30:02.374309 2568 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c is running failed: container process not found" containerID="eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Apr 24 21:30:02.374816 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:30:02.374784 2568 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c is running failed: container process not found" containerID="eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Apr 24 21:30:02.375101 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:30:02.375073 2568 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c is running failed: container process not found" containerID="eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Apr 24 21:30:02.375172 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:30:02.375117 2568 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c is running failed: container process not found" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="prometheus" probeResult="unknown"
Apr 24 21:30:02.431948 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.431913 2568 generic.go:358] "Generic (PLEG): container finished" podID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerID="9ff1c74ed683f6d13dd49628162ec3104098ecd490bbbcb105b30414d8956a70" exitCode=0
Apr 24 21:30:02.431948 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.431942 2568 generic.go:358] "Generic (PLEG): container finished" podID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerID="2e004e8c489dec782f7d0667d5618d66113d9a8830374a51b209d8e2898f1f25" exitCode=0
Apr 24 21:30:02.431948 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.431949 2568 generic.go:358] "Generic (PLEG): container finished" podID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerID="8b61b11bf960ee0f16f5d579fb66806a0d68d1f51421a556c2a7e4db9cab5f10" exitCode=0
Apr 24 21:30:02.431948 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.431954 2568 generic.go:358] "Generic (PLEG): container finished" podID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerID="86149dc678aa8ae3bb31a4ad289b6f21d9dac0e46e42ae17d1f0243c7d501ba0" exitCode=0
Apr 24 21:30:02.431948 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.431960 2568 generic.go:358] "Generic (PLEG): container finished" podID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerID="2aaf82e4fc5197238e782988c7365b08b33b15eff95a59b0c8484b2cc4292dac" exitCode=0
Apr 24 21:30:02.431948 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.431965 2568 generic.go:358] "Generic (PLEG): container finished" podID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerID="eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c" exitCode=0
Apr 24 21:30:02.432543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.431994 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerDied","Data":"9ff1c74ed683f6d13dd49628162ec3104098ecd490bbbcb105b30414d8956a70"}
Apr 24 21:30:02.432543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.432035 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerDied","Data":"2e004e8c489dec782f7d0667d5618d66113d9a8830374a51b209d8e2898f1f25"}
Apr 24 21:30:02.432543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.432045 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerDied","Data":"8b61b11bf960ee0f16f5d579fb66806a0d68d1f51421a556c2a7e4db9cab5f10"}
Apr 24 21:30:02.432543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.432054 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerDied","Data":"86149dc678aa8ae3bb31a4ad289b6f21d9dac0e46e42ae17d1f0243c7d501ba0"}
Apr 24 21:30:02.432543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.432063 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerDied","Data":"2aaf82e4fc5197238e782988c7365b08b33b15eff95a59b0c8484b2cc4292dac"}
Apr 24 21:30:02.432543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.432074 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerDied","Data":"eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c"}
Apr 24 21:30:02.442550 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.442520 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm"]
Apr 24 21:30:02.446065 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:30:02.446040 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab725086_c0f4_4827_ba09_3bcd0a0470e0.slice/crio-857a6ba7d2af46a8321abfa175606f62e9a291239001cab6ce54023e393b8fd3 WatchSource:0}: Error finding container 857a6ba7d2af46a8321abfa175606f62e9a291239001cab6ce54023e393b8fd3: Status 404 returned error can't find the container with id 857a6ba7d2af46a8321abfa175606f62e9a291239001cab6ce54023e393b8fd3
Apr 24 21:30:02.540855 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.540791 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:30:02.579673 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.579642 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-serving-certs-ca-bundle\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.579828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.579679 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-db\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.579828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.579724 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-kube-rbac-proxy\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.579828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.579753 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-tls\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.579828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.579771 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-kubelet-serving-ca-bundle\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.579828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.579786 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config-out\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.579828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.579800 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-rulefiles-0\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.579828 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.579830 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-tls-assets\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.580185 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580082 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:30:02.580445 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580355 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.580445 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580402 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580595 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-thanos-prometheus-http-client-file\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580655 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cq98\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-kube-api-access-4cq98\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580680 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-grpc-tls\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580735 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-metrics-client-certs\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580764 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-metrics-client-ca\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580804 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580837 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-web-config\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580876 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.581393 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:30:02.581750 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.581466 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:30:02.582284 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.580905 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-trusted-ca-bundle\") pod \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\" (UID: \"917dee55-1b79-41dc-9ff7-27c7e3d3f922\") "
Apr 24 21:30:02.582841 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.582434 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:30:02.582841 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.582459 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-db\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:30:02.582841 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.582468 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922").
InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:02.582841 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.582478 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.582841 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.582494 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.582841 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.582530 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.583233 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.582905 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:02.583233 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.583157 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:02.583344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.583309 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config-out" (OuterVolumeSpecName: "config-out") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:02.583929 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.583871 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.585389 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.585354 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.585645 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.585619 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.585758 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.585657 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.585758 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.585696 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.586905 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.586875 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). 
InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.587007 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.586978 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-kube-api-access-4cq98" (OuterVolumeSpecName: "kube-api-access-4cq98") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "kube-api-access-4cq98". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:02.587494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.587473 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config" (OuterVolumeSpecName: "config") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.597777 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.597746 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-web-config" (OuterVolumeSpecName: "web-config") pod "917dee55-1b79-41dc-9ff7-27c7e3d3f922" (UID: "917dee55-1b79-41dc-9ff7-27c7e3d3f922"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:02.683697 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683660 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683697 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683690 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config-out\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683697 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683719 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-tls-assets\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683733 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683747 2568 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683763 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cq98\" (UniqueName: \"kubernetes.io/projected/917dee55-1b79-41dc-9ff7-27c7e3d3f922-kube-api-access-4cq98\") on node 
\"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683774 2568 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-grpc-tls\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683786 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-metrics-client-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683799 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-configmap-metrics-client-ca\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683810 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-config\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683823 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-web-config\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683835 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 
21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683847 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/917dee55-1b79-41dc-9ff7-27c7e3d3f922-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:02.683967 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:02.683860 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/917dee55-1b79-41dc-9ff7-27c7e3d3f922-secret-kube-rbac-proxy\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:30:03.436819 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.436738 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm" event={"ID":"ab725086-c0f4-4827-ba09-3bcd0a0470e0","Type":"ContainerStarted","Data":"857a6ba7d2af46a8321abfa175606f62e9a291239001cab6ce54023e393b8fd3"} Apr 24 21:30:03.439897 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.439867 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"917dee55-1b79-41dc-9ff7-27c7e3d3f922","Type":"ContainerDied","Data":"3b7914c1a4f6c9d788ea18193a96ab2cd686d17446e2b05c20fadfe098c354f4"} Apr 24 21:30:03.440020 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.439916 2568 scope.go:117] "RemoveContainer" containerID="9ff1c74ed683f6d13dd49628162ec3104098ecd490bbbcb105b30414d8956a70" Apr 24 21:30:03.440020 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.439989 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.448204 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.448182 2568 scope.go:117] "RemoveContainer" containerID="2e004e8c489dec782f7d0667d5618d66113d9a8830374a51b209d8e2898f1f25" Apr 24 21:30:03.456115 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.456093 2568 scope.go:117] "RemoveContainer" containerID="8b61b11bf960ee0f16f5d579fb66806a0d68d1f51421a556c2a7e4db9cab5f10" Apr 24 21:30:03.463435 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.463413 2568 scope.go:117] "RemoveContainer" containerID="86149dc678aa8ae3bb31a4ad289b6f21d9dac0e46e42ae17d1f0243c7d501ba0" Apr 24 21:30:03.467545 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.467520 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:30:03.471772 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.471754 2568 scope.go:117] "RemoveContainer" containerID="2aaf82e4fc5197238e782988c7365b08b33b15eff95a59b0c8484b2cc4292dac" Apr 24 21:30:03.472510 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.472489 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:30:03.478044 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.478028 2568 scope.go:117] "RemoveContainer" containerID="eb702265797dfde6d1f8709eab4d88c39ad3fc555fdb3f074b50fadeec03a71c" Apr 24 21:30:03.484519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.484504 2568 scope.go:117] "RemoveContainer" containerID="cc6bb060f3f1774eda6376a8bbd0011fe34d8aea2c08ff97704d4720339b34c0" Apr 24 21:30:03.503027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503000 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:30:03.503542 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503524 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="prometheus" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503545 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="prometheus" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503568 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy-web" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503576 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy-web" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503586 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503596 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503608 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="thanos-sidecar" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503616 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="thanos-sidecar" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503630 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="init-config-reloader" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503637 2568 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="init-config-reloader" Apr 24 21:30:03.503643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503647 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy-thanos" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503655 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy-thanos" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503673 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="config-reloader" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503682 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="config-reloader" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503770 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy-thanos" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503784 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="thanos-sidecar" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503795 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="config-reloader" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503805 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy-web" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503815 2568 
memory_manager.go:356] "RemoveStaleState removing state" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="prometheus" Apr 24 21:30:03.504110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.503826 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" containerName="kube-rbac-proxy" Apr 24 21:30:03.508219 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.508199 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.511103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.511076 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-6p4cm\"" Apr 24 21:30:03.511256 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.511121 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:30:03.511382 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.511260 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:30:03.511382 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.511272 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:30:03.511382 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.511271 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:30:03.511619 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.511568 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:30:03.511619 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.511570 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:30:03.511849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.511829 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3omq6bovccuc0\"" Apr 24 21:30:03.512120 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.512029 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:30:03.512213 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.512128 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:30:03.512213 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.512177 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:30:03.512323 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.512222 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:30:03.514448 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.514385 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:30:03.518107 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.518088 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:30:03.522023 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.522004 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:30:03.589889 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.589856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7z928\" (UniqueName: \"kubernetes.io/projected/cd913094-9ed8-4431-8f8f-1b6220e14c55-kube-api-access-7z928\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.589889 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.589892 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590073 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.589910 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590073 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.589959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd913094-9ed8-4431-8f8f-1b6220e14c55-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590073 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590014 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-config\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590073 
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590208 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590085 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590208 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590103 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590208 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590130 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590208 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590151 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590208 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590174 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590208 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590191 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590395 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590214 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590395 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590395 
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590280 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590395 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590295 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590395 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.590395 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.590377 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd913094-9ed8-4431-8f8f-1b6220e14c55-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.691797 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.691797 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691763 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.691797 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691788 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd913094-9ed8-4431-8f8f-1b6220e14c55-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z928\" (UniqueName: \"kubernetes.io/projected/cd913094-9ed8-4431-8f8f-1b6220e14c55-kube-api-access-7z928\") pod 
\"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691905 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691969 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd913094-9ed8-4431-8f8f-1b6220e14c55-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.691997 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-config\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692021 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692069 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692195 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692214 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692239 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692796 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692531 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.692853 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.692824 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.693651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.695338 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.695475 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.696568 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-config\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.696943 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.696978 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.697040 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.697104 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd913094-9ed8-4431-8f8f-1b6220e14c55-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.697320 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd913094-9ed8-4431-8f8f-1b6220e14c55-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.697459 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.697516 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.697666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.697626 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.698622 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.698600 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd913094-9ed8-4431-8f8f-1b6220e14c55-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.698788 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.698769 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.699027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.699005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/cd913094-9ed8-4431-8f8f-1b6220e14c55-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.702031 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.702011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z928\" (UniqueName: \"kubernetes.io/projected/cd913094-9ed8-4431-8f8f-1b6220e14c55-kube-api-access-7z928\") pod \"prometheus-k8s-0\" (UID: \"cd913094-9ed8-4431-8f8f-1b6220e14c55\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.821753 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.821696 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:03.917983 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.917947 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="917dee55-1b79-41dc-9ff7-27c7e3d3f922" path="/var/lib/kubelet/pods/917dee55-1b79-41dc-9ff7-27c7e3d3f922/volumes" Apr 24 21:30:03.969645 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:03.969570 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:30:04.276986 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:30:04.276942 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd913094_9ed8_4431_8f8f_1b6220e14c55.slice/crio-84f8bf9e09c3249a0ac4485ec54a81088d375c66d6a3fcfbeee1895b74c9daf1 WatchSource:0}: Error finding container 84f8bf9e09c3249a0ac4485ec54a81088d375c66d6a3fcfbeee1895b74c9daf1: Status 404 returned error can't find the container with id 84f8bf9e09c3249a0ac4485ec54a81088d375c66d6a3fcfbeee1895b74c9daf1 Apr 24 21:30:04.444840 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:04.444803 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm" event={"ID":"ab725086-c0f4-4827-ba09-3bcd0a0470e0","Type":"ContainerStarted","Data":"80613ff580b2d2474fbc9613fc1c29e4843898999b699849bac64123b6c66c3f"} Apr 24 21:30:04.446118 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:04.446092 2568 generic.go:358] "Generic (PLEG): container finished" podID="cd913094-9ed8-4431-8f8f-1b6220e14c55" containerID="b1a9c3cc79b721aa52ae4ec38de70290f973eb3cce511e0d4077552aa5506414" exitCode=0 Apr 24 21:30:04.446240 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:04.446180 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd913094-9ed8-4431-8f8f-1b6220e14c55","Type":"ContainerDied","Data":"b1a9c3cc79b721aa52ae4ec38de70290f973eb3cce511e0d4077552aa5506414"} Apr 24 21:30:04.446240 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:04.446220 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd913094-9ed8-4431-8f8f-1b6220e14c55","Type":"ContainerStarted","Data":"84f8bf9e09c3249a0ac4485ec54a81088d375c66d6a3fcfbeee1895b74c9daf1"} Apr 24 21:30:05.454006 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.453964 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd913094-9ed8-4431-8f8f-1b6220e14c55","Type":"ContainerStarted","Data":"4a5128fd6ca0768409aa19b1a429b2f27cb161ef3d24f9d5793af3ded22577ae"} Apr 24 21:30:05.454006 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.454007 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd913094-9ed8-4431-8f8f-1b6220e14c55","Type":"ContainerStarted","Data":"377f42f96d0274de283135a576a8fe9d0eaddba89befd390e3dc6d101170e889"} Apr 24 21:30:05.454478 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.454019 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"cd913094-9ed8-4431-8f8f-1b6220e14c55","Type":"ContainerStarted","Data":"e915366bec50244edbf0b43e95fcbb36b898f7c036bf51b82870159059e1f903"} Apr 24 21:30:05.454478 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.454032 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd913094-9ed8-4431-8f8f-1b6220e14c55","Type":"ContainerStarted","Data":"b1407bbf27001fff002d8a88947f7d6d6cecaf9c1e4e797dbbbc1434259dbddc"} Apr 24 21:30:05.454478 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.454045 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd913094-9ed8-4431-8f8f-1b6220e14c55","Type":"ContainerStarted","Data":"efbee37f0a3f29940133e3aeb2bdfec2bfce709d767046b961767f21c4c5f53c"} Apr 24 21:30:05.454478 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.454058 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd913094-9ed8-4431-8f8f-1b6220e14c55","Type":"ContainerStarted","Data":"2b56b323f3cb284a3a09d44ecfca21b7bff5d74b8f72073c49af316f782037e1"} Apr 24 21:30:05.455641 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.455617 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm" event={"ID":"ab725086-c0f4-4827-ba09-3bcd0a0470e0","Type":"ContainerStarted","Data":"60295030925b007469f5c1bd03aeee93342d19057e2ae2e9ecfbacede5b72238"} Apr 24 21:30:05.455738 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.455644 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm" event={"ID":"ab725086-c0f4-4827-ba09-3bcd0a0470e0","Type":"ContainerStarted","Data":"e3f05f590981cf78c2a85f0ef078c9405e4bc271e94197dc825cbfab59097b32"} Apr 24 21:30:05.496011 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.495961 2568 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.495945968 podStartE2EDuration="2.495945968s" podCreationTimestamp="2026-04-24 21:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:05.49503414 +0000 UTC m=+130.098102922" watchObservedRunningTime="2026-04-24 21:30:05.495945968 +0000 UTC m=+130.099014748" Apr 24 21:30:05.520509 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:05.520458 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6bbc6c965b-tzqbm" podStartSLOduration=2.657846343 podStartE2EDuration="4.520443675s" podCreationTimestamp="2026-04-24 21:30:01 +0000 UTC" firstStartedPulling="2026-04-24 21:30:02.447819185 +0000 UTC m=+127.050887948" lastFinishedPulling="2026-04-24 21:30:04.310416508 +0000 UTC m=+128.913485280" observedRunningTime="2026-04-24 21:30:05.519089763 +0000 UTC m=+130.122158581" watchObservedRunningTime="2026-04-24 21:30:05.520443675 +0000 UTC m=+130.123512455" Apr 24 21:30:08.822490 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:30:08.822456 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.822583 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:31:03.822530 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:03.837647 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:31:03.837622 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:04.640424 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:31:04.640398 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:06.593061 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.593025 2568 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pblbn"] Apr 24 21:32:06.596120 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.596103 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.603364 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.603345 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:32:06.609269 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.609247 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pblbn"] Apr 24 21:32:06.660308 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.660280 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2e1fa258-3349-404c-99a9-780be75b2a17-original-pull-secret\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.660409 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.660333 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2e1fa258-3349-404c-99a9-780be75b2a17-dbus\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.660409 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.660378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2e1fa258-3349-404c-99a9-780be75b2a17-kubelet-config\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 
21:32:06.760834 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.760813 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2e1fa258-3349-404c-99a9-780be75b2a17-kubelet-config\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.760956 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.760859 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2e1fa258-3349-404c-99a9-780be75b2a17-original-pull-secret\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.760956 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.760894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2e1fa258-3349-404c-99a9-780be75b2a17-dbus\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.761041 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.760944 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2e1fa258-3349-404c-99a9-780be75b2a17-kubelet-config\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.761041 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.761021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2e1fa258-3349-404c-99a9-780be75b2a17-dbus\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " 
pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.763047 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.763028 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2e1fa258-3349-404c-99a9-780be75b2a17-original-pull-secret\") pod \"global-pull-secret-syncer-pblbn\" (UID: \"2e1fa258-3349-404c-99a9-780be75b2a17\") " pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:06.905400 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:06.905335 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pblbn" Apr 24 21:32:07.017867 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:07.017835 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pblbn"] Apr 24 21:32:07.021004 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:32:07.020974 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e1fa258_3349_404c_99a9_780be75b2a17.slice/crio-916bc912a58ffe5c121beec70601ce07974ab64fe3e703ad05dcfb9e411c0bc9 WatchSource:0}: Error finding container 916bc912a58ffe5c121beec70601ce07974ab64fe3e703ad05dcfb9e411c0bc9: Status 404 returned error can't find the container with id 916bc912a58ffe5c121beec70601ce07974ab64fe3e703ad05dcfb9e411c0bc9 Apr 24 21:32:07.809126 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:07.809084 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pblbn" event={"ID":"2e1fa258-3349-404c-99a9-780be75b2a17","Type":"ContainerStarted","Data":"916bc912a58ffe5c121beec70601ce07974ab64fe3e703ad05dcfb9e411c0bc9"} Apr 24 21:32:10.820109 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:10.820056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pblbn" 
event={"ID":"2e1fa258-3349-404c-99a9-780be75b2a17","Type":"ContainerStarted","Data":"9c071115b6a2c60166774f584238f7cb0b805feaf7f25eace8ad0760374aee39"} Apr 24 21:32:10.842862 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:10.842812 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pblbn" podStartSLOduration=1.214313625 podStartE2EDuration="4.842797799s" podCreationTimestamp="2026-04-24 21:32:06 +0000 UTC" firstStartedPulling="2026-04-24 21:32:07.022613338 +0000 UTC m=+251.625682114" lastFinishedPulling="2026-04-24 21:32:10.651097526 +0000 UTC m=+255.254166288" observedRunningTime="2026-04-24 21:32:10.839560888 +0000 UTC m=+255.442629661" watchObservedRunningTime="2026-04-24 21:32:10.842797799 +0000 UTC m=+255.445866579" Apr 24 21:32:55.801485 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:55.801452 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:32:55.802656 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:55.802637 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:32:55.807626 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:32:55.807605 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:34:04.598371 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.598332 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-d9vs2"] Apr 24 21:34:04.600784 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.600768 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:04.605348 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.605323 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 24 21:34:04.606040 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.606016 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-mtdsl\"" Apr 24 21:34:04.606162 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.606016 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 24 21:34:04.647068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.647039 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-d9vs2"] Apr 24 21:34:04.791497 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.791465 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a338495-15cf-4037-ba0f-dc4621bdb6fa-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-d9vs2\" (UID: \"5a338495-15cf-4037-ba0f-dc4621bdb6fa\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:04.791497 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.791497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwdj\" (UniqueName: \"kubernetes.io/projected/5a338495-15cf-4037-ba0f-dc4621bdb6fa-kube-api-access-7rwdj\") pod \"cert-manager-webhook-587ccfb98-d9vs2\" (UID: \"5a338495-15cf-4037-ba0f-dc4621bdb6fa\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:04.892007 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.891931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/5a338495-15cf-4037-ba0f-dc4621bdb6fa-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-d9vs2\" (UID: \"5a338495-15cf-4037-ba0f-dc4621bdb6fa\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:04.892007 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.891965 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwdj\" (UniqueName: \"kubernetes.io/projected/5a338495-15cf-4037-ba0f-dc4621bdb6fa-kube-api-access-7rwdj\") pod \"cert-manager-webhook-587ccfb98-d9vs2\" (UID: \"5a338495-15cf-4037-ba0f-dc4621bdb6fa\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:04.910702 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.910672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwdj\" (UniqueName: \"kubernetes.io/projected/5a338495-15cf-4037-ba0f-dc4621bdb6fa-kube-api-access-7rwdj\") pod \"cert-manager-webhook-587ccfb98-d9vs2\" (UID: \"5a338495-15cf-4037-ba0f-dc4621bdb6fa\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:04.911036 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.911018 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a338495-15cf-4037-ba0f-dc4621bdb6fa-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-d9vs2\" (UID: \"5a338495-15cf-4037-ba0f-dc4621bdb6fa\") " pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:04.921254 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:04.921235 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:05.069664 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:05.069629 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-d9vs2"] Apr 24 21:34:05.073758 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:34:05.073731 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a338495_15cf_4037_ba0f_dc4621bdb6fa.slice/crio-7f158debc4cce46bbacc7f1ce5908db522f457bec4a42f0cafebddfab3818e4a WatchSource:0}: Error finding container 7f158debc4cce46bbacc7f1ce5908db522f457bec4a42f0cafebddfab3818e4a: Status 404 returned error can't find the container with id 7f158debc4cce46bbacc7f1ce5908db522f457bec4a42f0cafebddfab3818e4a Apr 24 21:34:05.075467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:05.075452 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:34:05.139107 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:05.139079 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" event={"ID":"5a338495-15cf-4037-ba0f-dc4621bdb6fa","Type":"ContainerStarted","Data":"7f158debc4cce46bbacc7f1ce5908db522f457bec4a42f0cafebddfab3818e4a"} Apr 24 21:34:06.713051 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.713012 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-7pjqv"] Apr 24 21:34:06.718467 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.718445 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" Apr 24 21:34:06.727504 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.726790 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-rrhht\"" Apr 24 21:34:06.747288 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.747261 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-7pjqv"] Apr 24 21:34:06.810150 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.810123 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a55e480-4089-47bb-b2bd-54675fa1073e-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-7pjqv\" (UID: \"2a55e480-4089-47bb-b2bd-54675fa1073e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" Apr 24 21:34:06.810291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.810172 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz225\" (UniqueName: \"kubernetes.io/projected/2a55e480-4089-47bb-b2bd-54675fa1073e-kube-api-access-rz225\") pod \"cert-manager-cainjector-68b757865b-7pjqv\" (UID: \"2a55e480-4089-47bb-b2bd-54675fa1073e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" Apr 24 21:34:06.911349 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.911269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a55e480-4089-47bb-b2bd-54675fa1073e-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-7pjqv\" (UID: \"2a55e480-4089-47bb-b2bd-54675fa1073e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" Apr 24 21:34:06.911349 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.911336 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rz225\" (UniqueName: \"kubernetes.io/projected/2a55e480-4089-47bb-b2bd-54675fa1073e-kube-api-access-rz225\") pod \"cert-manager-cainjector-68b757865b-7pjqv\" (UID: \"2a55e480-4089-47bb-b2bd-54675fa1073e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" Apr 24 21:34:06.926931 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.926901 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a55e480-4089-47bb-b2bd-54675fa1073e-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-7pjqv\" (UID: \"2a55e480-4089-47bb-b2bd-54675fa1073e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" Apr 24 21:34:06.927076 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:06.926932 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz225\" (UniqueName: \"kubernetes.io/projected/2a55e480-4089-47bb-b2bd-54675fa1073e-kube-api-access-rz225\") pod \"cert-manager-cainjector-68b757865b-7pjqv\" (UID: \"2a55e480-4089-47bb-b2bd-54675fa1073e\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" Apr 24 21:34:07.030063 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:07.030023 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" Apr 24 21:34:07.409213 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:07.409179 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-7pjqv"] Apr 24 21:34:08.204128 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:34:08.204093 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a55e480_4089_47bb_b2bd_54675fa1073e.slice/crio-abbe99e213fe8c945e85b1ebb6c79fad357bdb5ee28c961661cdf4a969175df4 WatchSource:0}: Error finding container abbe99e213fe8c945e85b1ebb6c79fad357bdb5ee28c961661cdf4a969175df4: Status 404 returned error can't find the container with id abbe99e213fe8c945e85b1ebb6c79fad357bdb5ee28c961661cdf4a969175df4 Apr 24 21:34:09.154622 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:09.154578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" event={"ID":"5a338495-15cf-4037-ba0f-dc4621bdb6fa","Type":"ContainerStarted","Data":"397a530f9fa2969414f3cb9ab1737daced384a15e8a9625c108c2ef042a93c05"} Apr 24 21:34:09.154834 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:09.154669 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:09.155905 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:09.155882 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" event={"ID":"2a55e480-4089-47bb-b2bd-54675fa1073e","Type":"ContainerStarted","Data":"3bb21c977eae95b8cca2a53b448113e88afaf9ce01a9105f9fc2c43391ecb9be"} Apr 24 21:34:09.155905 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:09.155907 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" 
event={"ID":"2a55e480-4089-47bb-b2bd-54675fa1073e","Type":"ContainerStarted","Data":"abbe99e213fe8c945e85b1ebb6c79fad357bdb5ee28c961661cdf4a969175df4"} Apr 24 21:34:09.184439 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:09.184401 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" podStartSLOduration=2.021887896 podStartE2EDuration="5.184390554s" podCreationTimestamp="2026-04-24 21:34:04 +0000 UTC" firstStartedPulling="2026-04-24 21:34:05.075596629 +0000 UTC m=+369.678665390" lastFinishedPulling="2026-04-24 21:34:08.238099281 +0000 UTC m=+372.841168048" observedRunningTime="2026-04-24 21:34:09.184033689 +0000 UTC m=+373.787102472" watchObservedRunningTime="2026-04-24 21:34:09.184390554 +0000 UTC m=+373.787459335" Apr 24 21:34:09.214535 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:09.214499 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-7pjqv" podStartSLOduration=2.721290151 podStartE2EDuration="3.214487637s" podCreationTimestamp="2026-04-24 21:34:06 +0000 UTC" firstStartedPulling="2026-04-24 21:34:08.206153579 +0000 UTC m=+372.809222354" lastFinishedPulling="2026-04-24 21:34:08.69935107 +0000 UTC m=+373.302419840" observedRunningTime="2026-04-24 21:34:09.214052708 +0000 UTC m=+373.817121504" watchObservedRunningTime="2026-04-24 21:34:09.214487637 +0000 UTC m=+373.817556418" Apr 24 21:34:13.524850 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.524818 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-nhtmh"] Apr 24 21:34:13.527200 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.527181 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-nhtmh" Apr 24 21:34:13.530163 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.530148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-42s74\"" Apr 24 21:34:13.544339 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.544318 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-nhtmh"] Apr 24 21:34:13.564196 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.564177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d1df480-dacf-4154-bdc0-ce520c7269dd-bound-sa-token\") pod \"cert-manager-79c8d999ff-nhtmh\" (UID: \"9d1df480-dacf-4154-bdc0-ce520c7269dd\") " pod="cert-manager/cert-manager-79c8d999ff-nhtmh" Apr 24 21:34:13.564295 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.564266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6hcw\" (UniqueName: \"kubernetes.io/projected/9d1df480-dacf-4154-bdc0-ce520c7269dd-kube-api-access-j6hcw\") pod \"cert-manager-79c8d999ff-nhtmh\" (UID: \"9d1df480-dacf-4154-bdc0-ce520c7269dd\") " pod="cert-manager/cert-manager-79c8d999ff-nhtmh" Apr 24 21:34:13.665261 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.665237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6hcw\" (UniqueName: \"kubernetes.io/projected/9d1df480-dacf-4154-bdc0-ce520c7269dd-kube-api-access-j6hcw\") pod \"cert-manager-79c8d999ff-nhtmh\" (UID: \"9d1df480-dacf-4154-bdc0-ce520c7269dd\") " pod="cert-manager/cert-manager-79c8d999ff-nhtmh" Apr 24 21:34:13.665401 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.665279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9d1df480-dacf-4154-bdc0-ce520c7269dd-bound-sa-token\") pod \"cert-manager-79c8d999ff-nhtmh\" (UID: \"9d1df480-dacf-4154-bdc0-ce520c7269dd\") " pod="cert-manager/cert-manager-79c8d999ff-nhtmh" Apr 24 21:34:13.677013 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.676992 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6hcw\" (UniqueName: \"kubernetes.io/projected/9d1df480-dacf-4154-bdc0-ce520c7269dd-kube-api-access-j6hcw\") pod \"cert-manager-79c8d999ff-nhtmh\" (UID: \"9d1df480-dacf-4154-bdc0-ce520c7269dd\") " pod="cert-manager/cert-manager-79c8d999ff-nhtmh" Apr 24 21:34:13.677188 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.677171 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d1df480-dacf-4154-bdc0-ce520c7269dd-bound-sa-token\") pod \"cert-manager-79c8d999ff-nhtmh\" (UID: \"9d1df480-dacf-4154-bdc0-ce520c7269dd\") " pod="cert-manager/cert-manager-79c8d999ff-nhtmh" Apr 24 21:34:13.836453 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.836376 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-nhtmh" Apr 24 21:34:13.961570 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:13.961536 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-nhtmh"] Apr 24 21:34:13.964549 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:34:13.964519 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d1df480_dacf_4154_bdc0_ce520c7269dd.slice/crio-c9f94eb87619d62ed38e5fc06bf9e8f08e4c49347a1112d7a90ef9faa6bbee83 WatchSource:0}: Error finding container c9f94eb87619d62ed38e5fc06bf9e8f08e4c49347a1112d7a90ef9faa6bbee83: Status 404 returned error can't find the container with id c9f94eb87619d62ed38e5fc06bf9e8f08e4c49347a1112d7a90ef9faa6bbee83 Apr 24 21:34:14.169811 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:14.169735 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-nhtmh" event={"ID":"9d1df480-dacf-4154-bdc0-ce520c7269dd","Type":"ContainerStarted","Data":"e0d5e1d2f10af396b8f5403157d1148e9c2544bb763797292ef0e18691a5208a"} Apr 24 21:34:14.169811 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:14.169768 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-nhtmh" event={"ID":"9d1df480-dacf-4154-bdc0-ce520c7269dd","Type":"ContainerStarted","Data":"c9f94eb87619d62ed38e5fc06bf9e8f08e4c49347a1112d7a90ef9faa6bbee83"} Apr 24 21:34:14.195604 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:14.195559 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-nhtmh" podStartSLOduration=1.195549071 podStartE2EDuration="1.195549071s" podCreationTimestamp="2026-04-24 21:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:34:14.194985373 +0000 UTC 
m=+378.798054153" watchObservedRunningTime="2026-04-24 21:34:14.195549071 +0000 UTC m=+378.798617852" Apr 24 21:34:15.160693 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:15.160661 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-d9vs2" Apr 24 21:34:49.503104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.503066 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l"] Apr 24 21:34:49.507382 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.507355 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.509954 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.509933 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 24 21:34:49.510184 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.510168 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 24 21:34:49.511314 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.511291 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 24 21:34:49.511930 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.511914 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 24 21:34:49.512118 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.512030 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:34:49.512181 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.512103 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-b9ct4\"" Apr 24 21:34:49.546589 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.546563 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l"] Apr 24 21:34:49.668108 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.668074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-manager-config\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.668277 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.668124 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-cert\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.668277 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.668226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-metrics-cert\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.668277 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.668258 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8x9w\" (UniqueName: \"kubernetes.io/projected/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-kube-api-access-x8x9w\") pod 
\"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.769557 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.769483 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-metrics-cert\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.769557 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.769524 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8x9w\" (UniqueName: \"kubernetes.io/projected/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-kube-api-access-x8x9w\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.769779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.769560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-manager-config\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.769779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.769598 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-cert\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.770243 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:34:49.770222 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-manager-config\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.771929 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.771908 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-metrics-cert\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.772028 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.771972 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-cert\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.790331 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.790303 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8x9w\" (UniqueName: \"kubernetes.io/projected/dedbecbe-3e87-4e0c-b877-3e8d7b56d8db-kube-api-access-x8x9w\") pod \"lws-controller-manager-5774f66dc9-rt46l\" (UID: \"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.817148 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.817127 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:49.971202 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:49.965356 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l"] Apr 24 21:34:49.976056 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:34:49.976024 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddedbecbe_3e87_4e0c_b877_3e8d7b56d8db.slice/crio-5e4865c4991fb812dd2125cf7791976554ff06c825c2e6db3b69403588dba9fb WatchSource:0}: Error finding container 5e4865c4991fb812dd2125cf7791976554ff06c825c2e6db3b69403588dba9fb: Status 404 returned error can't find the container with id 5e4865c4991fb812dd2125cf7791976554ff06c825c2e6db3b69403588dba9fb Apr 24 21:34:50.271183 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:50.271146 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" event={"ID":"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db","Type":"ContainerStarted","Data":"5e4865c4991fb812dd2125cf7791976554ff06c825c2e6db3b69403588dba9fb"} Apr 24 21:34:53.281482 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:53.281448 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" event={"ID":"dedbecbe-3e87-4e0c-b877-3e8d7b56d8db","Type":"ContainerStarted","Data":"9756a5319e058ae45d9a62e82ec7550b5f0cd971346dabff7125f0873e968d0e"} Apr 24 21:34:53.281857 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:53.281528 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:34:53.318836 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:34:53.318781 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" podStartSLOduration=1.97541173 podStartE2EDuration="4.31876273s" podCreationTimestamp="2026-04-24 21:34:49 +0000 UTC" firstStartedPulling="2026-04-24 21:34:49.978130401 +0000 UTC m=+414.581199165" lastFinishedPulling="2026-04-24 21:34:52.321481403 +0000 UTC m=+416.924550165" observedRunningTime="2026-04-24 21:34:53.31709134 +0000 UTC m=+417.920160120" watchObservedRunningTime="2026-04-24 21:34:53.31876273 +0000 UTC m=+417.921831511" Apr 24 21:35:04.286854 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:35:04.286817 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-rt46l" Apr 24 21:36:08.528679 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.528643 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4d9ct"] Apr 24 21:36:08.532087 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.532072 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct" Apr 24 21:36:08.534388 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.534360 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 21:36:08.534512 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.534449 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-wkdcg\"" Apr 24 21:36:08.535068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.535054 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 21:36:08.547460 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.547439 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4d9ct"] Apr 24 21:36:08.640838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.640801 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbw6n\" (UniqueName: \"kubernetes.io/projected/dbc35bae-0b55-443b-a605-04ac2db9b369-kube-api-access-wbw6n\") pod \"authorino-operator-7587b89b76-4d9ct\" (UID: \"dbc35bae-0b55-443b-a605-04ac2db9b369\") " pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct" Apr 24 21:36:08.741985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.741957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbw6n\" (UniqueName: \"kubernetes.io/projected/dbc35bae-0b55-443b-a605-04ac2db9b369-kube-api-access-wbw6n\") pod \"authorino-operator-7587b89b76-4d9ct\" (UID: \"dbc35bae-0b55-443b-a605-04ac2db9b369\") " pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct" Apr 24 21:36:08.751936 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.751906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wbw6n\" (UniqueName: \"kubernetes.io/projected/dbc35bae-0b55-443b-a605-04ac2db9b369-kube-api-access-wbw6n\") pod \"authorino-operator-7587b89b76-4d9ct\" (UID: \"dbc35bae-0b55-443b-a605-04ac2db9b369\") " pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct" Apr 24 21:36:08.842603 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.842519 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct" Apr 24 21:36:08.975678 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:08.975653 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4d9ct"] Apr 24 21:36:08.978082 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:36:08.978056 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbc35bae_0b55_443b_a605_04ac2db9b369.slice/crio-9dee81a6f0901c8a98b8f1349cd348aa69042dd142abddc1da6c859b74b642e8 WatchSource:0}: Error finding container 9dee81a6f0901c8a98b8f1349cd348aa69042dd142abddc1da6c859b74b642e8: Status 404 returned error can't find the container with id 9dee81a6f0901c8a98b8f1349cd348aa69042dd142abddc1da6c859b74b642e8 Apr 24 21:36:09.509891 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:09.509855 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct" event={"ID":"dbc35bae-0b55-443b-a605-04ac2db9b369","Type":"ContainerStarted","Data":"9dee81a6f0901c8a98b8f1349cd348aa69042dd142abddc1da6c859b74b642e8"} Apr 24 21:36:12.520456 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:12.520421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct" event={"ID":"dbc35bae-0b55-443b-a605-04ac2db9b369","Type":"ContainerStarted","Data":"85ce8edc8ccd984697e385c48d13a584ce80e7927242c3ee9fa53b01643cde8c"} Apr 24 21:36:12.520830 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:36:12.520566 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct"
Apr 24 21:36:12.537779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:12.537730 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct" podStartSLOduration=1.810257371 podStartE2EDuration="4.537718789s" podCreationTimestamp="2026-04-24 21:36:08 +0000 UTC" firstStartedPulling="2026-04-24 21:36:08.980504251 +0000 UTC m=+493.583573013" lastFinishedPulling="2026-04-24 21:36:11.707965658 +0000 UTC m=+496.311034431" observedRunningTime="2026-04-24 21:36:12.536814817 +0000 UTC m=+497.139883597" watchObservedRunningTime="2026-04-24 21:36:12.537718789 +0000 UTC m=+497.140787562"
Apr 24 21:36:23.527588 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:36:23.527550 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-4d9ct"
Apr 24 21:37:02.771484 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.771442 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-z4h84"]
Apr 24 21:37:02.774989 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.774968 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:02.777151 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.777127 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 24 21:37:02.777249 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.777192 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-99gpq\""
Apr 24 21:37:02.785030 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.785011 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-z4h84"]
Apr 24 21:37:02.873450 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.873419 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-z4h84"]
Apr 24 21:37:02.888531 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.888497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqbw8\" (UniqueName: \"kubernetes.io/projected/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-kube-api-access-bqbw8\") pod \"limitador-limitador-64c8f475fb-z4h84\" (UID: \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:02.888694 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.888550 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-config-file\") pod \"limitador-limitador-64c8f475fb-z4h84\" (UID: \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:02.989286 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.989245 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqbw8\" (UniqueName: \"kubernetes.io/projected/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-kube-api-access-bqbw8\") pod \"limitador-limitador-64c8f475fb-z4h84\" (UID: \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:02.989471 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.989323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-config-file\") pod \"limitador-limitador-64c8f475fb-z4h84\" (UID: \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:02.989970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.989950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-config-file\") pod \"limitador-limitador-64c8f475fb-z4h84\" (UID: \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:02.998046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:02.998017 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqbw8\" (UniqueName: \"kubernetes.io/projected/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-kube-api-access-bqbw8\") pod \"limitador-limitador-64c8f475fb-z4h84\" (UID: \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:03.084927 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.084832 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:03.203928 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.203903 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-z4h84"]
Apr 24 21:37:03.206477 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:37:03.206448 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd73ef5f5_dd34_4238_bcb9_2a09b7168ab7.slice/crio-d4b8f63c27fc27ceec2215e48ce531abc40be8147de9b65bf650f6e25efaeff2 WatchSource:0}: Error finding container d4b8f63c27fc27ceec2215e48ce531abc40be8147de9b65bf650f6e25efaeff2: Status 404 returned error can't find the container with id d4b8f63c27fc27ceec2215e48ce531abc40be8147de9b65bf650f6e25efaeff2
Apr 24 21:37:03.646689 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.646660 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-ckdb9"]
Apr 24 21:37:03.656142 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.655514 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ckdb9"
Apr 24 21:37:03.656743 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.656698 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ckdb9"]
Apr 24 21:37:03.658649 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.658617 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-v2n49\""
Apr 24 21:37:03.685076 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.685042 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84" event={"ID":"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7","Type":"ContainerStarted","Data":"d4b8f63c27fc27ceec2215e48ce531abc40be8147de9b65bf650f6e25efaeff2"}
Apr 24 21:37:03.797635 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.797596 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvqmp\" (UniqueName: \"kubernetes.io/projected/456eaa70-cc54-4488-a716-99e6f4946326-kube-api-access-lvqmp\") pod \"authorino-674b59b84c-ckdb9\" (UID: \"456eaa70-cc54-4488-a716-99e6f4946326\") " pod="kuadrant-system/authorino-674b59b84c-ckdb9"
Apr 24 21:37:03.898158 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.898081 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvqmp\" (UniqueName: \"kubernetes.io/projected/456eaa70-cc54-4488-a716-99e6f4946326-kube-api-access-lvqmp\") pod \"authorino-674b59b84c-ckdb9\" (UID: \"456eaa70-cc54-4488-a716-99e6f4946326\") " pod="kuadrant-system/authorino-674b59b84c-ckdb9"
Apr 24 21:37:03.907316 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.907284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvqmp\" (UniqueName: \"kubernetes.io/projected/456eaa70-cc54-4488-a716-99e6f4946326-kube-api-access-lvqmp\") pod \"authorino-674b59b84c-ckdb9\" (UID: \"456eaa70-cc54-4488-a716-99e6f4946326\") " pod="kuadrant-system/authorino-674b59b84c-ckdb9"
Apr 24 21:37:03.968017 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:03.967988 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ckdb9"
Apr 24 21:37:04.084004 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:04.083974 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ckdb9"]
Apr 24 21:37:04.087073 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:37:04.087023 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod456eaa70_cc54_4488_a716_99e6f4946326.slice/crio-276148efbbfd72f7ea548eac57094216b72d281ff44848b45b84a9132f659984 WatchSource:0}: Error finding container 276148efbbfd72f7ea548eac57094216b72d281ff44848b45b84a9132f659984: Status 404 returned error can't find the container with id 276148efbbfd72f7ea548eac57094216b72d281ff44848b45b84a9132f659984
Apr 24 21:37:04.691027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:04.690992 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ckdb9" event={"ID":"456eaa70-cc54-4488-a716-99e6f4946326","Type":"ContainerStarted","Data":"276148efbbfd72f7ea548eac57094216b72d281ff44848b45b84a9132f659984"}
Apr 24 21:37:07.773562 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:07.773519 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ckdb9"]
Apr 24 21:37:08.707663 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:08.707578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84" event={"ID":"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7","Type":"ContainerStarted","Data":"124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c"}
Apr 24 21:37:08.707663 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:08.707647 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:08.708982 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:08.708957 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ckdb9" event={"ID":"456eaa70-cc54-4488-a716-99e6f4946326","Type":"ContainerStarted","Data":"94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4"}
Apr 24 21:37:08.709082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:08.708997 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-ckdb9" podUID="456eaa70-cc54-4488-a716-99e6f4946326" containerName="authorino" containerID="cri-o://94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4" gracePeriod=30
Apr 24 21:37:08.726080 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:08.726031 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84" podStartSLOduration=1.461944484 podStartE2EDuration="6.726017182s" podCreationTimestamp="2026-04-24 21:37:02 +0000 UTC" firstStartedPulling="2026-04-24 21:37:03.208230047 +0000 UTC m=+547.811298806" lastFinishedPulling="2026-04-24 21:37:08.47230274 +0000 UTC m=+553.075371504" observedRunningTime="2026-04-24 21:37:08.725486289 +0000 UTC m=+553.328555067" watchObservedRunningTime="2026-04-24 21:37:08.726017182 +0000 UTC m=+553.329085964"
Apr 24 21:37:08.740901 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:08.740862 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-ckdb9" podStartSLOduration=1.409945512 podStartE2EDuration="5.740850786s" podCreationTimestamp="2026-04-24 21:37:03 +0000 UTC" firstStartedPulling="2026-04-24 21:37:04.09031835 +0000 UTC m=+548.693387110" lastFinishedPulling="2026-04-24 21:37:08.421223621 +0000 UTC m=+553.024292384" observedRunningTime="2026-04-24 21:37:08.739003479 +0000 UTC m=+553.342072252" watchObservedRunningTime="2026-04-24 21:37:08.740850786 +0000 UTC m=+553.343919586"
Apr 24 21:37:08.943218 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:08.943194 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ckdb9"
Apr 24 21:37:09.045396 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.045372 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvqmp\" (UniqueName: \"kubernetes.io/projected/456eaa70-cc54-4488-a716-99e6f4946326-kube-api-access-lvqmp\") pod \"456eaa70-cc54-4488-a716-99e6f4946326\" (UID: \"456eaa70-cc54-4488-a716-99e6f4946326\") "
Apr 24 21:37:09.047419 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.047398 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/456eaa70-cc54-4488-a716-99e6f4946326-kube-api-access-lvqmp" (OuterVolumeSpecName: "kube-api-access-lvqmp") pod "456eaa70-cc54-4488-a716-99e6f4946326" (UID: "456eaa70-cc54-4488-a716-99e6f4946326"). InnerVolumeSpecName "kube-api-access-lvqmp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:37:09.146600 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.146569 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvqmp\" (UniqueName: \"kubernetes.io/projected/456eaa70-cc54-4488-a716-99e6f4946326-kube-api-access-lvqmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:37:09.713207 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.713175 2568 generic.go:358] "Generic (PLEG): container finished" podID="456eaa70-cc54-4488-a716-99e6f4946326" containerID="94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4" exitCode=0
Apr 24 21:37:09.713386 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.713229 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ckdb9"
Apr 24 21:37:09.713386 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.713257 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ckdb9" event={"ID":"456eaa70-cc54-4488-a716-99e6f4946326","Type":"ContainerDied","Data":"94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4"}
Apr 24 21:37:09.713386 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.713292 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ckdb9" event={"ID":"456eaa70-cc54-4488-a716-99e6f4946326","Type":"ContainerDied","Data":"276148efbbfd72f7ea548eac57094216b72d281ff44848b45b84a9132f659984"}
Apr 24 21:37:09.713386 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.713307 2568 scope.go:117] "RemoveContainer" containerID="94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4"
Apr 24 21:37:09.721320 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.721304 2568 scope.go:117] "RemoveContainer" containerID="94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4"
Apr 24 21:37:09.721557 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:37:09.721541 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4\": container with ID starting with 94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4 not found: ID does not exist" containerID="94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4"
Apr 24 21:37:09.721621 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.721565 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4"} err="failed to get container status \"94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4\": rpc error: code = NotFound desc = could not find container \"94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4\": container with ID starting with 94207a0524a714b4c330278dd04b07a9421bd9c56aa352c8725345f4a612c0c4 not found: ID does not exist"
Apr 24 21:37:09.735126 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.735078 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ckdb9"]
Apr 24 21:37:09.738763 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.738744 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ckdb9"]
Apr 24 21:37:09.916064 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:09.916039 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="456eaa70-cc54-4488-a716-99e6f4946326" path="/var/lib/kubelet/pods/456eaa70-cc54-4488-a716-99e6f4946326/volumes"
Apr 24 21:37:16.779397 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:16.779364 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-z4h84"]
Apr 24 21:37:16.779808 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:16.779587 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84" podUID="d73ef5f5-dd34-4238-bcb9-2a09b7168ab7" containerName="limitador" containerID="cri-o://124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c" gracePeriod=30
Apr 24 21:37:16.780230 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:16.780210 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:17.710196 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.710177 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:17.718293 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.718271 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-config-file\") pod \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\" (UID: \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\") "
Apr 24 21:37:17.718363 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.718314 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqbw8\" (UniqueName: \"kubernetes.io/projected/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-kube-api-access-bqbw8\") pod \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\" (UID: \"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7\") "
Apr 24 21:37:17.718583 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.718558 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-config-file" (OuterVolumeSpecName: "config-file") pod "d73ef5f5-dd34-4238-bcb9-2a09b7168ab7" (UID: "d73ef5f5-dd34-4238-bcb9-2a09b7168ab7"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:37:17.720215 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.720198 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-kube-api-access-bqbw8" (OuterVolumeSpecName: "kube-api-access-bqbw8") pod "d73ef5f5-dd34-4238-bcb9-2a09b7168ab7" (UID: "d73ef5f5-dd34-4238-bcb9-2a09b7168ab7"). InnerVolumeSpecName "kube-api-access-bqbw8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:37:17.740489 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.740460 2568 generic.go:358] "Generic (PLEG): container finished" podID="d73ef5f5-dd34-4238-bcb9-2a09b7168ab7" containerID="124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c" exitCode=0
Apr 24 21:37:17.740591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.740511 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84"
Apr 24 21:37:17.740591 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.740533 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84" event={"ID":"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7","Type":"ContainerDied","Data":"124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c"}
Apr 24 21:37:17.740667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.740590 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-z4h84" event={"ID":"d73ef5f5-dd34-4238-bcb9-2a09b7168ab7","Type":"ContainerDied","Data":"d4b8f63c27fc27ceec2215e48ce531abc40be8147de9b65bf650f6e25efaeff2"}
Apr 24 21:37:17.740667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.740610 2568 scope.go:117] "RemoveContainer" containerID="124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c"
Apr 24 21:37:17.748586 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.748564 2568 scope.go:117] "RemoveContainer" containerID="124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c"
Apr 24 21:37:17.749091 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:37:17.749070 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c\": container with ID starting with 124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c not found: ID does not exist" containerID="124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c"
Apr 24 21:37:17.749178 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.749097 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c"} err="failed to get container status \"124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c\": rpc error: code = NotFound desc = could not find container \"124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c\": container with ID starting with 124169f77aae086605337fa38f9e46bf696ef08f92231f25a93760e02ecd123c not found: ID does not exist"
Apr 24 21:37:17.763663 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.763644 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-z4h84"]
Apr 24 21:37:17.770294 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.770277 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-z4h84"]
Apr 24 21:37:17.819632 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.819605 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqbw8\" (UniqueName: \"kubernetes.io/projected/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-kube-api-access-bqbw8\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:37:17.819632 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.819627 2568 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7-config-file\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:37:17.916378 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:17.916300 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73ef5f5-dd34-4238-bcb9-2a09b7168ab7" path="/var/lib/kubelet/pods/d73ef5f5-dd34-4238-bcb9-2a09b7168ab7/volumes"
Apr 24 21:37:25.968186 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.968151 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-jf4qz"]
Apr 24 21:37:25.968565 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.968474 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="456eaa70-cc54-4488-a716-99e6f4946326" containerName="authorino"
Apr 24 21:37:25.968565 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.968485 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="456eaa70-cc54-4488-a716-99e6f4946326" containerName="authorino"
Apr 24 21:37:25.968565 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.968499 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d73ef5f5-dd34-4238-bcb9-2a09b7168ab7" containerName="limitador"
Apr 24 21:37:25.968565 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.968505 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73ef5f5-dd34-4238-bcb9-2a09b7168ab7" containerName="limitador"
Apr 24 21:37:25.968736 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.968571 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d73ef5f5-dd34-4238-bcb9-2a09b7168ab7" containerName="limitador"
Apr 24 21:37:25.968736 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.968580 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="456eaa70-cc54-4488-a716-99e6f4946326" containerName="authorino"
Apr 24 21:37:25.972650 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.972625 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-jf4qz"
Apr 24 21:37:25.975835 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.975811 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 24 21:37:25.975962 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.975820 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-v2n49\""
Apr 24 21:37:25.978571 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:25.978549 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-jf4qz"]
Apr 24 21:37:26.086565 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.086531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3597e4d1-468f-4024-bb70-46b7952accf1-tls-cert\") pod \"authorino-68bd676465-jf4qz\" (UID: \"3597e4d1-468f-4024-bb70-46b7952accf1\") " pod="kuadrant-system/authorino-68bd676465-jf4qz"
Apr 24 21:37:26.086565 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.086566 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tvjm\" (UniqueName: \"kubernetes.io/projected/3597e4d1-468f-4024-bb70-46b7952accf1-kube-api-access-4tvjm\") pod \"authorino-68bd676465-jf4qz\" (UID: \"3597e4d1-468f-4024-bb70-46b7952accf1\") " pod="kuadrant-system/authorino-68bd676465-jf4qz"
Apr 24 21:37:26.187600 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.187564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3597e4d1-468f-4024-bb70-46b7952accf1-tls-cert\") pod \"authorino-68bd676465-jf4qz\" (UID: \"3597e4d1-468f-4024-bb70-46b7952accf1\") " pod="kuadrant-system/authorino-68bd676465-jf4qz"
Apr 24 21:37:26.187600 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.187600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tvjm\" (UniqueName: \"kubernetes.io/projected/3597e4d1-468f-4024-bb70-46b7952accf1-kube-api-access-4tvjm\") pod \"authorino-68bd676465-jf4qz\" (UID: \"3597e4d1-468f-4024-bb70-46b7952accf1\") " pod="kuadrant-system/authorino-68bd676465-jf4qz"
Apr 24 21:37:26.189990 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.189969 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3597e4d1-468f-4024-bb70-46b7952accf1-tls-cert\") pod \"authorino-68bd676465-jf4qz\" (UID: \"3597e4d1-468f-4024-bb70-46b7952accf1\") " pod="kuadrant-system/authorino-68bd676465-jf4qz"
Apr 24 21:37:26.196667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.196639 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tvjm\" (UniqueName: \"kubernetes.io/projected/3597e4d1-468f-4024-bb70-46b7952accf1-kube-api-access-4tvjm\") pod \"authorino-68bd676465-jf4qz\" (UID: \"3597e4d1-468f-4024-bb70-46b7952accf1\") " pod="kuadrant-system/authorino-68bd676465-jf4qz"
Apr 24 21:37:26.284307 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.284285 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-jf4qz"
Apr 24 21:37:26.401364 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.401312 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-jf4qz"]
Apr 24 21:37:26.404184 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:37:26.404148 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3597e4d1_468f_4024_bb70_46b7952accf1.slice/crio-4c95f557ca3caa33e2750142d3806ad1cef92c7d96d105449a6e6fff200578d5 WatchSource:0}: Error finding container 4c95f557ca3caa33e2750142d3806ad1cef92c7d96d105449a6e6fff200578d5: Status 404 returned error can't find the container with id 4c95f557ca3caa33e2750142d3806ad1cef92c7d96d105449a6e6fff200578d5
Apr 24 21:37:26.774477 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:26.774433 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-jf4qz" event={"ID":"3597e4d1-468f-4024-bb70-46b7952accf1","Type":"ContainerStarted","Data":"4c95f557ca3caa33e2750142d3806ad1cef92c7d96d105449a6e6fff200578d5"}
Apr 24 21:37:27.780642 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:27.780603 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-jf4qz" event={"ID":"3597e4d1-468f-4024-bb70-46b7952accf1","Type":"ContainerStarted","Data":"856070ed0916c1152f295d003ecd3c78e624514c35096de7fa208cd0b810ab0f"}
Apr 24 21:37:27.803841 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:27.803794 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-jf4qz" podStartSLOduration=2.3383638270000002 podStartE2EDuration="2.803780001s" podCreationTimestamp="2026-04-24 21:37:25 +0000 UTC" firstStartedPulling="2026-04-24 21:37:26.405346293 +0000 UTC m=+571.008415052" lastFinishedPulling="2026-04-24 21:37:26.870762449 +0000 UTC m=+571.473831226" observedRunningTime="2026-04-24 21:37:27.802937737 +0000 UTC m=+572.406006518" watchObservedRunningTime="2026-04-24 21:37:27.803780001 +0000 UTC m=+572.406848782"
Apr 24 21:37:46.471626 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.471594 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-q9b4m"]
Apr 24 21:37:46.491934 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.491900 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-q9b4m"]
Apr 24 21:37:46.492087 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.492032 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m"
Apr 24 21:37:46.494999 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.494975 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 24 21:37:46.495114 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.494977 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-hxmbl\""
Apr 24 21:37:46.495114 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.494977 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 21:37:46.495788 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.495769 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 21:37:46.505661 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.505640 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6c58f78c97-glpn7"]
Apr 24 21:37:46.509418 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.509400 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7"
Apr 24 21:37:46.512327 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.512305 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-8gbxr\""
Apr 24 21:37:46.520523 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.520488 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 24 21:37:46.522509 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.522484 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6c58f78c97-glpn7"]
Apr 24 21:37:46.564344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.564313 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5p7t\" (UniqueName: \"kubernetes.io/projected/f7066048-823c-48f1-b8fb-d9939a4c45a4-kube-api-access-m5p7t\") pod \"kserve-controller-manager-67f77cd7d7-q9b4m\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m"
Apr 24 21:37:46.564479 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.564395 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert\") pod \"kserve-controller-manager-67f77cd7d7-q9b4m\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m"
Apr 24 21:37:46.665673 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.665641 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5p7t\" (UniqueName: \"kubernetes.io/projected/f7066048-823c-48f1-b8fb-d9939a4c45a4-kube-api-access-m5p7t\") pod \"kserve-controller-manager-67f77cd7d7-q9b4m\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m"
Apr 24 21:37:46.665840 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.665717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert\") pod \"kserve-controller-manager-67f77cd7d7-q9b4m\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m"
Apr 24 21:37:46.665840 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.665749 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4fhw\" (UniqueName: \"kubernetes.io/projected/30d4e8b1-4a7c-4475-965e-71681ac30d9e-kube-api-access-j4fhw\") pod \"llmisvc-controller-manager-6c58f78c97-glpn7\" (UID: \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\") " pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7"
Apr 24 21:37:46.665840 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.665778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30d4e8b1-4a7c-4475-965e-71681ac30d9e-cert\") pod \"llmisvc-controller-manager-6c58f78c97-glpn7\" (UID: \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\") " pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7"
Apr 24 21:37:46.665945 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:37:46.665847 2568 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 24 21:37:46.665945 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:37:46.665915 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert podName:f7066048-823c-48f1-b8fb-d9939a4c45a4 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:47.165899416 +0000 UTC m=+591.768968176 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert") pod "kserve-controller-manager-67f77cd7d7-q9b4m" (UID: "f7066048-823c-48f1-b8fb-d9939a4c45a4") : secret "kserve-webhook-server-cert" not found
Apr 24 21:37:46.674701 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.674681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5p7t\" (UniqueName: \"kubernetes.io/projected/f7066048-823c-48f1-b8fb-d9939a4c45a4-kube-api-access-m5p7t\") pod \"kserve-controller-manager-67f77cd7d7-q9b4m\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m"
Apr 24 21:37:46.766527 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.766492 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4fhw\" (UniqueName: \"kubernetes.io/projected/30d4e8b1-4a7c-4475-965e-71681ac30d9e-kube-api-access-j4fhw\") pod \"llmisvc-controller-manager-6c58f78c97-glpn7\" (UID: \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\") " pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7"
Apr 24 21:37:46.766527 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.766529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30d4e8b1-4a7c-4475-965e-71681ac30d9e-cert\") pod \"llmisvc-controller-manager-6c58f78c97-glpn7\" (UID: \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\") " pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7"
Apr 24 21:37:46.768940 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.768914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30d4e8b1-4a7c-4475-965e-71681ac30d9e-cert\") pod \"llmisvc-controller-manager-6c58f78c97-glpn7\" (UID: \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\") " pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7"
Apr 24 21:37:46.780685
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.780660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4fhw\" (UniqueName: \"kubernetes.io/projected/30d4e8b1-4a7c-4475-965e-71681ac30d9e-kube-api-access-j4fhw\") pod \"llmisvc-controller-manager-6c58f78c97-glpn7\" (UID: \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\") " pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" Apr 24 21:37:46.826784 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.826762 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" Apr 24 21:37:46.952307 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:46.952273 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6c58f78c97-glpn7"] Apr 24 21:37:46.955080 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:37:46.955052 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod30d4e8b1_4a7c_4475_965e_71681ac30d9e.slice/crio-b98f2c2a961e1d601ca32ba8dca95df0fc1da7dc8c676102376d8f38b793fc51 WatchSource:0}: Error finding container b98f2c2a961e1d601ca32ba8dca95df0fc1da7dc8c676102376d8f38b793fc51: Status 404 returned error can't find the container with id b98f2c2a961e1d601ca32ba8dca95df0fc1da7dc8c676102376d8f38b793fc51 Apr 24 21:37:47.170255 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:47.170182 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert\") pod \"kserve-controller-manager-67f77cd7d7-q9b4m\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" Apr 24 21:37:47.172431 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:47.172409 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert\") pod \"kserve-controller-manager-67f77cd7d7-q9b4m\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" Apr 24 21:37:47.402596 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:47.402567 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" Apr 24 21:37:47.553089 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:47.553065 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-q9b4m"] Apr 24 21:37:47.556149 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:37:47.556118 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7066048_823c_48f1_b8fb_d9939a4c45a4.slice/crio-4c8178d90746b623b154fc705710d8972eb73224695318cf3cfedfc5edc0cc13 WatchSource:0}: Error finding container 4c8178d90746b623b154fc705710d8972eb73224695318cf3cfedfc5edc0cc13: Status 404 returned error can't find the container with id 4c8178d90746b623b154fc705710d8972eb73224695318cf3cfedfc5edc0cc13 Apr 24 21:37:47.845316 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:47.845284 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" event={"ID":"f7066048-823c-48f1-b8fb-d9939a4c45a4","Type":"ContainerStarted","Data":"4c8178d90746b623b154fc705710d8972eb73224695318cf3cfedfc5edc0cc13"} Apr 24 21:37:47.846406 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:47.846376 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" event={"ID":"30d4e8b1-4a7c-4475-965e-71681ac30d9e","Type":"ContainerStarted","Data":"b98f2c2a961e1d601ca32ba8dca95df0fc1da7dc8c676102376d8f38b793fc51"} Apr 24 21:37:50.859444 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:50.859408 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" event={"ID":"30d4e8b1-4a7c-4475-965e-71681ac30d9e","Type":"ContainerStarted","Data":"c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78"} Apr 24 21:37:50.859913 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:50.859464 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" Apr 24 21:37:50.860726 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:50.860685 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" event={"ID":"f7066048-823c-48f1-b8fb-d9939a4c45a4","Type":"ContainerStarted","Data":"68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2"} Apr 24 21:37:50.860862 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:50.860850 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" Apr 24 21:37:50.879138 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:50.879062 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" podStartSLOduration=1.22215175 podStartE2EDuration="4.879050841s" podCreationTimestamp="2026-04-24 21:37:46 +0000 UTC" firstStartedPulling="2026-04-24 21:37:46.956344086 +0000 UTC m=+591.559412849" lastFinishedPulling="2026-04-24 21:37:50.613243181 +0000 UTC m=+595.216311940" observedRunningTime="2026-04-24 21:37:50.877677907 +0000 UTC m=+595.480746725" watchObservedRunningTime="2026-04-24 21:37:50.879050841 +0000 UTC m=+595.482119622" Apr 24 21:37:50.899041 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:50.898996 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" podStartSLOduration=1.7975687200000001 podStartE2EDuration="4.898983887s" podCreationTimestamp="2026-04-24 21:37:46 +0000 UTC" 
firstStartedPulling="2026-04-24 21:37:47.557686967 +0000 UTC m=+592.160755726" lastFinishedPulling="2026-04-24 21:37:50.659102121 +0000 UTC m=+595.262170893" observedRunningTime="2026-04-24 21:37:50.896151069 +0000 UTC m=+595.499219849" watchObservedRunningTime="2026-04-24 21:37:50.898983887 +0000 UTC m=+595.502052667" Apr 24 21:37:55.828994 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:55.828965 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:37:55.829949 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:37:55.829922 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:38:21.865596 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:21.865516 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" Apr 24 21:38:21.868764 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:21.868747 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" Apr 24 21:38:23.232957 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.232919 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-q9b4m"] Apr 24 21:38:23.233409 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.233204 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" podUID="f7066048-823c-48f1-b8fb-d9939a4c45a4" containerName="manager" containerID="cri-o://68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2" gracePeriod=10 Apr 24 21:38:23.260654 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.260628 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/kserve-controller-manager-67f77cd7d7-ffx4r"] Apr 24 21:38:23.312039 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.312012 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-ffx4r"] Apr 24 21:38:23.312148 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.312120 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:23.383747 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.383699 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0d4508-2f84-4216-af5f-dd2e9f0cd920-cert\") pod \"kserve-controller-manager-67f77cd7d7-ffx4r\" (UID: \"ec0d4508-2f84-4216-af5f-dd2e9f0cd920\") " pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:23.383839 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.383816 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmgp5\" (UniqueName: \"kubernetes.io/projected/ec0d4508-2f84-4216-af5f-dd2e9f0cd920-kube-api-access-pmgp5\") pod \"kserve-controller-manager-67f77cd7d7-ffx4r\" (UID: \"ec0d4508-2f84-4216-af5f-dd2e9f0cd920\") " pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:23.484465 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.484393 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmgp5\" (UniqueName: \"kubernetes.io/projected/ec0d4508-2f84-4216-af5f-dd2e9f0cd920-kube-api-access-pmgp5\") pod \"kserve-controller-manager-67f77cd7d7-ffx4r\" (UID: \"ec0d4508-2f84-4216-af5f-dd2e9f0cd920\") " pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:23.484618 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.484501 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ec0d4508-2f84-4216-af5f-dd2e9f0cd920-cert\") pod \"kserve-controller-manager-67f77cd7d7-ffx4r\" (UID: \"ec0d4508-2f84-4216-af5f-dd2e9f0cd920\") " pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:23.487056 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.487026 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0d4508-2f84-4216-af5f-dd2e9f0cd920-cert\") pod \"kserve-controller-manager-67f77cd7d7-ffx4r\" (UID: \"ec0d4508-2f84-4216-af5f-dd2e9f0cd920\") " pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:23.494137 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.494105 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmgp5\" (UniqueName: \"kubernetes.io/projected/ec0d4508-2f84-4216-af5f-dd2e9f0cd920-kube-api-access-pmgp5\") pod \"kserve-controller-manager-67f77cd7d7-ffx4r\" (UID: \"ec0d4508-2f84-4216-af5f-dd2e9f0cd920\") " pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:23.505166 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.505147 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" Apr 24 21:38:23.585954 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.585920 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5p7t\" (UniqueName: \"kubernetes.io/projected/f7066048-823c-48f1-b8fb-d9939a4c45a4-kube-api-access-m5p7t\") pod \"f7066048-823c-48f1-b8fb-d9939a4c45a4\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " Apr 24 21:38:23.585954 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.585955 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert\") pod \"f7066048-823c-48f1-b8fb-d9939a4c45a4\" (UID: \"f7066048-823c-48f1-b8fb-d9939a4c45a4\") " Apr 24 21:38:23.588021 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.587988 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7066048-823c-48f1-b8fb-d9939a4c45a4-kube-api-access-m5p7t" (OuterVolumeSpecName: "kube-api-access-m5p7t") pod "f7066048-823c-48f1-b8fb-d9939a4c45a4" (UID: "f7066048-823c-48f1-b8fb-d9939a4c45a4"). InnerVolumeSpecName "kube-api-access-m5p7t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:38:23.588133 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.588041 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert" (OuterVolumeSpecName: "cert") pod "f7066048-823c-48f1-b8fb-d9939a4c45a4" (UID: "f7066048-823c-48f1-b8fb-d9939a4c45a4"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:38:23.669582 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.669551 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:23.686551 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.686527 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5p7t\" (UniqueName: \"kubernetes.io/projected/f7066048-823c-48f1-b8fb-d9939a4c45a4-kube-api-access-m5p7t\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:38:23.686551 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.686552 2568 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7066048-823c-48f1-b8fb-d9939a4c45a4-cert\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:38:23.785151 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.784916 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-ffx4r"] Apr 24 21:38:23.787359 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:38:23.787332 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0d4508_2f84_4216_af5f_dd2e9f0cd920.slice/crio-b10bd7624ca1573eaa4e7da2f8a2158bd3cfcaa503ccdd9ed6e81418b53f6382 WatchSource:0}: Error finding container b10bd7624ca1573eaa4e7da2f8a2158bd3cfcaa503ccdd9ed6e81418b53f6382: Status 404 returned error can't find the container with id b10bd7624ca1573eaa4e7da2f8a2158bd3cfcaa503ccdd9ed6e81418b53f6382 Apr 24 21:38:23.974168 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.974132 2568 generic.go:358] "Generic (PLEG): container finished" podID="f7066048-823c-48f1-b8fb-d9939a4c45a4" containerID="68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2" exitCode=0 Apr 24 21:38:23.974328 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.974197 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" Apr 24 21:38:23.974328 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.974208 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" event={"ID":"f7066048-823c-48f1-b8fb-d9939a4c45a4","Type":"ContainerDied","Data":"68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2"} Apr 24 21:38:23.974328 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.974240 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-q9b4m" event={"ID":"f7066048-823c-48f1-b8fb-d9939a4c45a4","Type":"ContainerDied","Data":"4c8178d90746b623b154fc705710d8972eb73224695318cf3cfedfc5edc0cc13"} Apr 24 21:38:23.974328 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.974261 2568 scope.go:117] "RemoveContainer" containerID="68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2" Apr 24 21:38:23.975321 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.975300 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" event={"ID":"ec0d4508-2f84-4216-af5f-dd2e9f0cd920","Type":"ContainerStarted","Data":"b10bd7624ca1573eaa4e7da2f8a2158bd3cfcaa503ccdd9ed6e81418b53f6382"} Apr 24 21:38:23.981899 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.981871 2568 scope.go:117] "RemoveContainer" containerID="68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2" Apr 24 21:38:23.982148 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:38:23.982126 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2\": container with ID starting with 68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2 not found: ID does not exist" containerID="68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2" Apr 24 
21:38:23.982207 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.982161 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2"} err="failed to get container status \"68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2\": rpc error: code = NotFound desc = could not find container \"68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2\": container with ID starting with 68a8ddafd7b99838de83131776aad06573c3a1e2a99558053a13b72299ff79d2 not found: ID does not exist" Apr 24 21:38:23.992285 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.992263 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-q9b4m"] Apr 24 21:38:23.996590 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:23.996572 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-q9b4m"] Apr 24 21:38:24.981391 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:24.981357 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" event={"ID":"ec0d4508-2f84-4216-af5f-dd2e9f0cd920","Type":"ContainerStarted","Data":"59c13156773b88a07f8b9d41dfb8d771841cf33cd4ae0f2b448bebb9ca6b5b42"} Apr 24 21:38:24.981824 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:24.981406 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:38:24.999096 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:24.999047 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" podStartSLOduration=1.35932817 podStartE2EDuration="1.999006921s" podCreationTimestamp="2026-04-24 21:38:23 +0000 UTC" firstStartedPulling="2026-04-24 21:38:23.788584209 +0000 UTC m=+628.391652969" lastFinishedPulling="2026-04-24 
21:38:24.428262961 +0000 UTC m=+629.031331720" observedRunningTime="2026-04-24 21:38:24.998464752 +0000 UTC m=+629.601533534" watchObservedRunningTime="2026-04-24 21:38:24.999006921 +0000 UTC m=+629.602075703" Apr 24 21:38:25.916563 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:25.916526 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7066048-823c-48f1-b8fb-d9939a4c45a4" path="/var/lib/kubelet/pods/f7066048-823c-48f1-b8fb-d9939a4c45a4/volumes" Apr 24 21:38:55.989185 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:38:55.989155 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-67f77cd7d7-ffx4r" Apr 24 21:40:01.086020 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.085985 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp"] Apr 24 21:40:01.086450 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.086357 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7066048-823c-48f1-b8fb-d9939a4c45a4" containerName="manager" Apr 24 21:40:01.086450 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.086368 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7066048-823c-48f1-b8fb-d9939a4c45a4" containerName="manager" Apr 24 21:40:01.086450 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.086445 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7066048-823c-48f1-b8fb-d9939a4c45a4" containerName="manager" Apr 24 21:40:01.089592 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.089574 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.094791 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.094766 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:40:01.094897 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.094766 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:40:01.094897 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.094821 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:40:01.094897 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.094766 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 24 21:40:01.094897 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.094765 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-xhwwp\"" Apr 24 21:40:01.104102 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.104080 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp"] Apr 24 21:40:01.214409 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.214375 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 
21:40:01.214589 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.214434 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.214589 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.214473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d23d5eea-f505-4a38-bdca-3fdf04064c16-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.214589 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.214555 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.214589 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.214584 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.214810 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.214615 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhstz\" (UniqueName: \"kubernetes.io/projected/d23d5eea-f505-4a38-bdca-3fdf04064c16-kube-api-access-mhstz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315183 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhstz\" (UniqueName: \"kubernetes.io/projected/d23d5eea-f505-4a38-bdca-3fdf04064c16-kube-api-access-mhstz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315367 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315200 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315367 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315233 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: 
\"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315367 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315265 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d23d5eea-f505-4a38-bdca-3fdf04064c16-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315367 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315367 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315696 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315673 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315799 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315806 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.315838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.315780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.317731 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.317699 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d23d5eea-f505-4a38-bdca-3fdf04064c16-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 
21:40:01.325117 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.325093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhstz\" (UniqueName: \"kubernetes.io/projected/d23d5eea-f505-4a38-bdca-3fdf04064c16-kube-api-access-mhstz\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.399435 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.399361 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:01.523848 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.523787 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp"] Apr 24 21:40:01.526506 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:40:01.526474 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd23d5eea_f505_4a38_bdca_3fdf04064c16.slice/crio-074dc0f2801dae1de8363fcb542b3b174152db34ae7c646a27a88db3b09b4c8e WatchSource:0}: Error finding container 074dc0f2801dae1de8363fcb542b3b174152db34ae7c646a27a88db3b09b4c8e: Status 404 returned error can't find the container with id 074dc0f2801dae1de8363fcb542b3b174152db34ae7c646a27a88db3b09b4c8e Apr 24 21:40:01.528335 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:01.528318 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:40:02.300110 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:02.300073 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" 
event={"ID":"d23d5eea-f505-4a38-bdca-3fdf04064c16","Type":"ContainerStarted","Data":"074dc0f2801dae1de8363fcb542b3b174152db34ae7c646a27a88db3b09b4c8e"} Apr 24 21:40:05.312381 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:05.312290 2568 generic.go:358] "Generic (PLEG): container finished" podID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerID="ac02edc4768385338d5a51e8282d8a6a7a1ea7948f8c476d18e6109a49e4849d" exitCode=0 Apr 24 21:40:05.312381 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:05.312354 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" event={"ID":"d23d5eea-f505-4a38-bdca-3fdf04064c16","Type":"ContainerDied","Data":"ac02edc4768385338d5a51e8282d8a6a7a1ea7948f8c476d18e6109a49e4849d"} Apr 24 21:40:07.321820 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:07.321752 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" event={"ID":"d23d5eea-f505-4a38-bdca-3fdf04064c16","Type":"ContainerStarted","Data":"49376e4d127e94caf8d11fef1d524ccdff1a59fcf5e59879ef43573dd5f7b24c"} Apr 24 21:40:36.437290 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:36.437245 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" event={"ID":"d23d5eea-f505-4a38-bdca-3fdf04064c16","Type":"ContainerStarted","Data":"1c950e55483497f9524e0bab44c1ffc1b87a4600bf4f6bc4dd8a6d918d248ddc"} Apr 24 21:40:36.437763 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:36.437498 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:36.440103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:36.440083 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:36.459558 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:36.459511 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" podStartSLOduration=0.701086835 podStartE2EDuration="35.459499083s" podCreationTimestamp="2026-04-24 21:40:01 +0000 UTC" firstStartedPulling="2026-04-24 21:40:01.528506054 +0000 UTC m=+726.131574814" lastFinishedPulling="2026-04-24 21:40:36.286918301 +0000 UTC m=+760.889987062" observedRunningTime="2026-04-24 21:40:36.457246661 +0000 UTC m=+761.060315445" watchObservedRunningTime="2026-04-24 21:40:36.459499083 +0000 UTC m=+761.062567864" Apr 24 21:40:41.399900 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:41.399861 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:41.399900 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:41.399905 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:41.400374 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:41.400160 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.32:8082/healthz\": dial tcp 10.134.0.32:8082: connect: connection refused" Apr 24 21:40:51.401827 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:40:51.401793 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:40:51.402894 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:40:51.402875 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:42:32.805151 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:32.805109 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp"] Apr 24 21:42:32.805747 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:32.805481 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="main" containerID="cri-o://49376e4d127e94caf8d11fef1d524ccdff1a59fcf5e59879ef43573dd5f7b24c" gracePeriod=30 Apr 24 21:42:32.805747 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:32.805528 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="tokenizer" containerID="cri-o://1c950e55483497f9524e0bab44c1ffc1b87a4600bf4f6bc4dd8a6d918d248ddc" gracePeriod=30 Apr 24 21:42:33.830951 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:33.830922 2568 generic.go:358] "Generic (PLEG): container finished" podID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerID="1c950e55483497f9524e0bab44c1ffc1b87a4600bf4f6bc4dd8a6d918d248ddc" exitCode=0 Apr 24 21:42:33.830951 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:33.830947 2568 generic.go:358] "Generic (PLEG): container finished" podID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerID="49376e4d127e94caf8d11fef1d524ccdff1a59fcf5e59879ef43573dd5f7b24c" exitCode=0 Apr 24 21:42:33.831391 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:33.830990 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" event={"ID":"d23d5eea-f505-4a38-bdca-3fdf04064c16","Type":"ContainerDied","Data":"1c950e55483497f9524e0bab44c1ffc1b87a4600bf4f6bc4dd8a6d918d248ddc"} Apr 24 21:42:33.831391 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:33.831027 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" event={"ID":"d23d5eea-f505-4a38-bdca-3fdf04064c16","Type":"ContainerDied","Data":"49376e4d127e94caf8d11fef1d524ccdff1a59fcf5e59879ef43573dd5f7b24c"} Apr 24 21:42:33.950694 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:33.950672 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:42:34.046628 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.046599 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d23d5eea-f505-4a38-bdca-3fdf04064c16-tls-certs\") pod \"d23d5eea-f505-4a38-bdca-3fdf04064c16\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " Apr 24 21:42:34.046803 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.046646 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-uds\") pod \"d23d5eea-f505-4a38-bdca-3fdf04064c16\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " Apr 24 21:42:34.046803 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.046692 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-kserve-provision-location\") pod \"d23d5eea-f505-4a38-bdca-3fdf04064c16\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " Apr 24 
21:42:34.046803 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.046744 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-tmp\") pod \"d23d5eea-f505-4a38-bdca-3fdf04064c16\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " Apr 24 21:42:34.046970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.046841 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-cache\") pod \"d23d5eea-f505-4a38-bdca-3fdf04064c16\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " Apr 24 21:42:34.046970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.046877 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhstz\" (UniqueName: \"kubernetes.io/projected/d23d5eea-f505-4a38-bdca-3fdf04064c16-kube-api-access-mhstz\") pod \"d23d5eea-f505-4a38-bdca-3fdf04064c16\" (UID: \"d23d5eea-f505-4a38-bdca-3fdf04064c16\") " Apr 24 21:42:34.046970 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.046924 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d23d5eea-f505-4a38-bdca-3fdf04064c16" (UID: "d23d5eea-f505-4a38-bdca-3fdf04064c16"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:34.047146 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.047119 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d23d5eea-f505-4a38-bdca-3fdf04064c16" (UID: "d23d5eea-f505-4a38-bdca-3fdf04064c16"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:34.047198 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.047136 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d23d5eea-f505-4a38-bdca-3fdf04064c16" (UID: "d23d5eea-f505-4a38-bdca-3fdf04064c16"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:34.047198 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.047134 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:42:34.047402 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.047380 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d23d5eea-f505-4a38-bdca-3fdf04064c16" (UID: "d23d5eea-f505-4a38-bdca-3fdf04064c16"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:34.048909 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.048887 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23d5eea-f505-4a38-bdca-3fdf04064c16-kube-api-access-mhstz" (OuterVolumeSpecName: "kube-api-access-mhstz") pod "d23d5eea-f505-4a38-bdca-3fdf04064c16" (UID: "d23d5eea-f505-4a38-bdca-3fdf04064c16"). InnerVolumeSpecName "kube-api-access-mhstz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:42:34.048975 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.048934 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23d5eea-f505-4a38-bdca-3fdf04064c16-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d23d5eea-f505-4a38-bdca-3fdf04064c16" (UID: "d23d5eea-f505-4a38-bdca-3fdf04064c16"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:42:34.147693 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.147659 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:42:34.147693 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.147692 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mhstz\" (UniqueName: \"kubernetes.io/projected/d23d5eea-f505-4a38-bdca-3fdf04064c16-kube-api-access-mhstz\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:42:34.147857 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.147734 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d23d5eea-f505-4a38-bdca-3fdf04064c16-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:42:34.147857 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.147748 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:42:34.147857 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.147760 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d23d5eea-f505-4a38-bdca-3fdf04064c16-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:42:34.835433 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.835392 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" event={"ID":"d23d5eea-f505-4a38-bdca-3fdf04064c16","Type":"ContainerDied","Data":"074dc0f2801dae1de8363fcb542b3b174152db34ae7c646a27a88db3b09b4c8e"} Apr 24 21:42:34.835433 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.835438 2568 scope.go:117] "RemoveContainer" containerID="1c950e55483497f9524e0bab44c1ffc1b87a4600bf4f6bc4dd8a6d918d248ddc" Apr 24 21:42:34.835993 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.835463 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp" Apr 24 21:42:34.843802 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.843785 2568 scope.go:117] "RemoveContainer" containerID="49376e4d127e94caf8d11fef1d524ccdff1a59fcf5e59879ef43573dd5f7b24c" Apr 24 21:42:34.850912 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.850895 2568 scope.go:117] "RemoveContainer" containerID="ac02edc4768385338d5a51e8282d8a6a7a1ea7948f8c476d18e6109a49e4849d" Apr 24 21:42:34.864333 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.864309 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp"] Apr 24 21:42:34.867927 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:34.867908 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5ff44t5nwp"] Apr 24 21:42:35.916416 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:35.916380 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" path="/var/lib/kubelet/pods/d23d5eea-f505-4a38-bdca-3fdf04064c16/volumes" Apr 24 21:42:37.289928 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.289892 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"] Apr 24 21:42:37.290335 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.290319 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="storage-initializer" Apr 24 21:42:37.290379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.290336 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="storage-initializer" Apr 24 21:42:37.290379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.290349 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="main" Apr 24 21:42:37.290379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.290355 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="main" Apr 24 21:42:37.290379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.290364 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="tokenizer" Apr 24 21:42:37.290379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.290369 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="tokenizer" Apr 24 21:42:37.290529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.290419 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="tokenizer" Apr 24 21:42:37.290529 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.290429 2568 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="d23d5eea-f505-4a38-bdca-3fdf04064c16" containerName="main" Apr 24 21:42:37.295204 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.295186 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.297534 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.297514 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 24 21:42:37.297630 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.297601 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-2tvkt\"" Apr 24 21:42:37.298323 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.298304 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:42:37.298414 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.298396 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:42:37.298476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.298465 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:42:37.306934 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.306911 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"] Apr 24 21:42:37.375060 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.375035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-uds\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.375207 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.375076 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.375207 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.375096 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.375304 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.375201 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lgdv\" (UniqueName: \"kubernetes.io/projected/40518c77-64b8-45d1-8435-8330f22ccc06-kube-api-access-9lgdv\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.375304 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.375228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.375304 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.375263 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40518c77-64b8-45d1-8435-8330f22ccc06-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476294 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lgdv\" (UniqueName: \"kubernetes.io/projected/40518c77-64b8-45d1-8435-8330f22ccc06-kube-api-access-9lgdv\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476427 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476427 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/40518c77-64b8-45d1-8435-8330f22ccc06-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476427 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476414 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476596 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476458 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476596 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476734 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476689 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476788 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476840 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476819 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.476875 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.476859 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" Apr 24 21:42:37.478791 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.478775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40518c77-64b8-45d1-8435-8330f22ccc06-tls-certs\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:42:37.486699 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.486680 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lgdv\" (UniqueName: \"kubernetes.io/projected/40518c77-64b8-45d1-8435-8330f22ccc06-kube-api-access-9lgdv\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:42:37.604927 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.604855 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:42:37.729184 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.729005 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"]
Apr 24 21:42:37.732077 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:42:37.732047 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40518c77_64b8_45d1_8435_8330f22ccc06.slice/crio-7acd4b3e965715ca9e6fea23ed891f825601c413542ea426269aa2c1075c1da9 WatchSource:0}: Error finding container 7acd4b3e965715ca9e6fea23ed891f825601c413542ea426269aa2c1075c1da9: Status 404 returned error can't find the container with id 7acd4b3e965715ca9e6fea23ed891f825601c413542ea426269aa2c1075c1da9
Apr 24 21:42:37.851904 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.851834 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" event={"ID":"40518c77-64b8-45d1-8435-8330f22ccc06","Type":"ContainerStarted","Data":"ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543"}
Apr 24 21:42:37.851904 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:37.851879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" event={"ID":"40518c77-64b8-45d1-8435-8330f22ccc06","Type":"ContainerStarted","Data":"7acd4b3e965715ca9e6fea23ed891f825601c413542ea426269aa2c1075c1da9"}
Apr 24 21:42:38.857757 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:38.857696 2568 generic.go:358] "Generic (PLEG): container finished" podID="40518c77-64b8-45d1-8435-8330f22ccc06" containerID="ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543" exitCode=0
Apr 24 21:42:38.857757 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:38.857739 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" event={"ID":"40518c77-64b8-45d1-8435-8330f22ccc06","Type":"ContainerDied","Data":"ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543"}
Apr 24 21:42:39.864074 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:39.864038 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" event={"ID":"40518c77-64b8-45d1-8435-8330f22ccc06","Type":"ContainerStarted","Data":"cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485"}
Apr 24 21:42:39.864074 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:39.864076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" event={"ID":"40518c77-64b8-45d1-8435-8330f22ccc06","Type":"ContainerStarted","Data":"3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739"}
Apr 24 21:42:39.864458 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:39.864139 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:42:39.890111 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:39.890043 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" podStartSLOduration=2.89002961 podStartE2EDuration="2.89002961s" podCreationTimestamp="2026-04-24 21:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:42:39.887687433 +0000 UTC m=+884.490756213" watchObservedRunningTime="2026-04-24 21:42:39.89002961 +0000 UTC m=+884.493098392"
Apr 24 21:42:47.605483 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:47.605390 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:42:47.605483 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:47.605447 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:42:47.607989 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:47.607962 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:42:47.892727 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:47.892622 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:42:55.853748 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:55.853721 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log"
Apr 24 21:42:55.855482 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:42:55.855450 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log"
Apr 24 21:43:08.895933 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:08.895907 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:43:53.057168 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.057135 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"]
Apr 24 21:43:53.060717 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.060687 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.063896 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.063873 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-jk8fl\""
Apr 24 21:43:53.064026 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.063921 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 24 21:43:53.071476 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.071455 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"]
Apr 24 21:43:53.140253 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.140230 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dff58802-382a-4e33-ae62-fb58c287e7ad-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.140368 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.140265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.140368 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.140299 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.140448 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.140367 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhnk\" (UniqueName: \"kubernetes.io/projected/dff58802-382a-4e33-ae62-fb58c287e7ad-kube-api-access-qbhnk\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.140448 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.140409 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.140448 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.140444 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.240915 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.240876 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dff58802-382a-4e33-ae62-fb58c287e7ad-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241085 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.240926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241085 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.240960 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241085 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.240996 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhnk\" (UniqueName: \"kubernetes.io/projected/dff58802-382a-4e33-ae62-fb58c287e7ad-kube-api-access-qbhnk\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241085 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.241025 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241294 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.241156 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241448 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.241418 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241602 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.241572 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241602 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.241448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.241721 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.241471 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.243388 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.243370 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dff58802-382a-4e33-ae62-fb58c287e7ad-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.252388 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.252364 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhnk\" (UniqueName: \"kubernetes.io/projected/dff58802-382a-4e33-ae62-fb58c287e7ad-kube-api-access-qbhnk\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.371259 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.371186 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:53.494129 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:53.494099 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"]
Apr 24 21:43:53.497435 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:43:53.497408 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff58802_382a_4e33_ae62_fb58c287e7ad.slice/crio-db1cb12d64d78c8797d4f820fbc68ac7dec07ab917f7e92addde123eba596c4e WatchSource:0}: Error finding container db1cb12d64d78c8797d4f820fbc68ac7dec07ab917f7e92addde123eba596c4e: Status 404 returned error can't find the container with id db1cb12d64d78c8797d4f820fbc68ac7dec07ab917f7e92addde123eba596c4e
Apr 24 21:43:54.117764 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:54.117663 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" event={"ID":"dff58802-382a-4e33-ae62-fb58c287e7ad","Type":"ContainerStarted","Data":"5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28"}
Apr 24 21:43:54.117764 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:54.117733 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" event={"ID":"dff58802-382a-4e33-ae62-fb58c287e7ad","Type":"ContainerStarted","Data":"db1cb12d64d78c8797d4f820fbc68ac7dec07ab917f7e92addde123eba596c4e"}
Apr 24 21:43:55.123242 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:55.123207 2568 generic.go:358] "Generic (PLEG): container finished" podID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerID="5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28" exitCode=0
Apr 24 21:43:55.123666 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:55.123295 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" event={"ID":"dff58802-382a-4e33-ae62-fb58c287e7ad","Type":"ContainerDied","Data":"5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28"}
Apr 24 21:43:56.128821 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:56.128782 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" event={"ID":"dff58802-382a-4e33-ae62-fb58c287e7ad","Type":"ContainerStarted","Data":"0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90"}
Apr 24 21:43:56.128821 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:56.128821 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" event={"ID":"dff58802-382a-4e33-ae62-fb58c287e7ad","Type":"ContainerStarted","Data":"798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0"}
Apr 24 21:43:56.129223 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:56.128912 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:43:56.151168 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:43:56.151118 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" podStartSLOduration=3.151104864 podStartE2EDuration="3.151104864s" podCreationTimestamp="2026-04-24 21:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:43:56.149876471 +0000 UTC m=+960.752945253" watchObservedRunningTime="2026-04-24 21:43:56.151104864 +0000 UTC m=+960.754173645"
Apr 24 21:44:03.371835 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:03.371797 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:44:03.372223 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:03.371852 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:44:03.374776 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:03.374748 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:44:04.158046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:04.158014 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:44:11.307218 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:11.307186 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"]
Apr 24 21:44:11.307667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:11.307577 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="main" containerID="cri-o://3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739" gracePeriod=30
Apr 24 21:44:11.307753 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:11.307677 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="tokenizer" containerID="cri-o://cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485" gracePeriod=30
Apr 24 21:44:12.184528 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.184497 2568 generic.go:358] "Generic (PLEG): container finished" podID="40518c77-64b8-45d1-8435-8330f22ccc06" containerID="3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739" exitCode=0
Apr 24 21:44:12.184693 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.184552 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" event={"ID":"40518c77-64b8-45d1-8435-8330f22ccc06","Type":"ContainerDied","Data":"3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739"}
Apr 24 21:44:12.460805 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.460782 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:44:12.524486 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.524454 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lgdv\" (UniqueName: \"kubernetes.io/projected/40518c77-64b8-45d1-8435-8330f22ccc06-kube-api-access-9lgdv\") pod \"40518c77-64b8-45d1-8435-8330f22ccc06\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") "
Apr 24 21:44:12.524652 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.524492 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40518c77-64b8-45d1-8435-8330f22ccc06-tls-certs\") pod \"40518c77-64b8-45d1-8435-8330f22ccc06\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") "
Apr 24 21:44:12.524652 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.524531 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-cache\") pod \"40518c77-64b8-45d1-8435-8330f22ccc06\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") "
Apr 24 21:44:12.524652 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.524573 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-uds\") pod \"40518c77-64b8-45d1-8435-8330f22ccc06\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") "
Apr 24 21:44:12.524652 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.524641 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-kserve-provision-location\") pod \"40518c77-64b8-45d1-8435-8330f22ccc06\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") "
Apr 24 21:44:12.524916 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.524676 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-tmp\") pod \"40518c77-64b8-45d1-8435-8330f22ccc06\" (UID: \"40518c77-64b8-45d1-8435-8330f22ccc06\") "
Apr 24 21:44:12.524916 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.524864 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "40518c77-64b8-45d1-8435-8330f22ccc06" (UID: "40518c77-64b8-45d1-8435-8330f22ccc06"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:12.524916 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.524890 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "40518c77-64b8-45d1-8435-8330f22ccc06" (UID: "40518c77-64b8-45d1-8435-8330f22ccc06"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:12.525061 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.525002 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:44:12.525061 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.525022 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:44:12.525150 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.525091 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "40518c77-64b8-45d1-8435-8330f22ccc06" (UID: "40518c77-64b8-45d1-8435-8330f22ccc06"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:12.525431 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.525412 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "40518c77-64b8-45d1-8435-8330f22ccc06" (UID: "40518c77-64b8-45d1-8435-8330f22ccc06"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:12.526602 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.526586 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40518c77-64b8-45d1-8435-8330f22ccc06-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "40518c77-64b8-45d1-8435-8330f22ccc06" (UID: "40518c77-64b8-45d1-8435-8330f22ccc06"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:44:12.526695 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.526673 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40518c77-64b8-45d1-8435-8330f22ccc06-kube-api-access-9lgdv" (OuterVolumeSpecName: "kube-api-access-9lgdv") pod "40518c77-64b8-45d1-8435-8330f22ccc06" (UID: "40518c77-64b8-45d1-8435-8330f22ccc06"). InnerVolumeSpecName "kube-api-access-9lgdv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:44:12.625452 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.625411 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9lgdv\" (UniqueName: \"kubernetes.io/projected/40518c77-64b8-45d1-8435-8330f22ccc06-kube-api-access-9lgdv\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:44:12.625452 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.625443 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/40518c77-64b8-45d1-8435-8330f22ccc06-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:44:12.625452 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.625456 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:44:12.625673 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:12.625469 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/40518c77-64b8-45d1-8435-8330f22ccc06-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:44:13.190004 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.189968 2568 generic.go:358] "Generic (PLEG): container finished" podID="40518c77-64b8-45d1-8435-8330f22ccc06" containerID="cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485" exitCode=0
Apr 24 21:44:13.190160 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.190020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" event={"ID":"40518c77-64b8-45d1-8435-8330f22ccc06","Type":"ContainerDied","Data":"cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485"}
Apr 24 21:44:13.190160 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.190034 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"
Apr 24 21:44:13.190160 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.190059 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v" event={"ID":"40518c77-64b8-45d1-8435-8330f22ccc06","Type":"ContainerDied","Data":"7acd4b3e965715ca9e6fea23ed891f825601c413542ea426269aa2c1075c1da9"}
Apr 24 21:44:13.190160 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.190077 2568 scope.go:117] "RemoveContainer" containerID="cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485"
Apr 24 21:44:13.198815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.198797 2568 scope.go:117] "RemoveContainer" containerID="3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739"
Apr 24 21:44:13.205760 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.205695 2568 scope.go:117] "RemoveContainer" containerID="ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543"
Apr 24 21:44:13.212329 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.212311 2568 scope.go:117] "RemoveContainer" containerID="cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485"
Apr 24 21:44:13.212554 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:44:13.212537 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485\": container with ID starting with cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485 not found: ID does not exist" containerID="cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485"
Apr 24 21:44:13.212619 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.212561 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485"} err="failed to get container status \"cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485\": rpc error: code = NotFound desc = could not find container \"cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485\": container with ID starting with cb3c9b40c13374b892a1d5a209045d9c485876eff7ba631e0ee76c2cb0441485 not found: ID does not exist"
Apr 24 21:44:13.212619 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.212577 2568 scope.go:117] "RemoveContainer" containerID="3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739"
Apr 24 21:44:13.212909 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:44:13.212880 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739\": container with ID starting with 3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739 not found: ID does not exist" containerID="3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739"
Apr 24 21:44:13.213026 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.212914 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739"} err="failed to get container status \"3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739\": rpc error: code = NotFound desc = could not find container \"3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739\": container with ID starting with 3937cb8a8414e93b38d205d8309d7f1c03f959706821068efa12279b5ab3f739 not found: ID does not exist"
Apr 24 21:44:13.213026 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.212936 2568 scope.go:117] "RemoveContainer" containerID="ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543"
Apr 24 21:44:13.213399 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:44:13.213369 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543\": container with ID starting with ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543 not found: ID does not exist" containerID="ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543"
Apr 24 21:44:13.213527 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.213406 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543"} err="failed to get container status \"ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543\": rpc error: code = NotFound desc = could not find container \"ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543\": container with ID starting with ecf11722bee47d9a32a2ac687d820badd46690d49989fa190e06bb8d94a2a543 not found: ID does not exist"
Apr 24 21:44:13.215180 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.215154 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"]
Apr 24 21:44:13.220907 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.220888 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74775wbm7v"]
Apr 24 21:44:13.916379 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:13.916340 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" path="/var/lib/kubelet/pods/40518c77-64b8-45d1-8435-8330f22ccc06/volumes"
Apr 24 21:44:25.161794 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:25.161722 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"
Apr 24 21:44:38.494299 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494265 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf"]
Apr 24 21:44:38.494657 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494628 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="main"
Apr 24 21:44:38.494657 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494640 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="main"
Apr 24 21:44:38.494657 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494654 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="tokenizer"
Apr 24 21:44:38.494815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494661 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="tokenizer"
Apr 24 21:44:38.494815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494684 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="storage-initializer"
Apr 24 21:44:38.494815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494689 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="storage-initializer"
Apr 24 21:44:38.494815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494770 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="tokenizer"
Apr 24 21:44:38.494815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.494785 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="40518c77-64b8-45d1-8435-8330f22ccc06" containerName="main"
Apr 24 21:44:38.498129 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.498110 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf"
Apr 24 21:44:38.500444 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.500423 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 24 21:44:38.500519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.500441 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-w6fwt\""
Apr 24 21:44:38.509777 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.509751 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf"]
Apr 24 21:44:38.646737 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.646677 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgnj2\" (UniqueName: \"kubernetes.io/projected/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kube-api-access-hgnj2\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") "
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.646910 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.646799 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.646910 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.646870 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.646910 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.646899 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.647035 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.646928 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: 
\"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.647035 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.646950 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748211 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748110 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748211 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748197 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748211 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748222 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: 
\"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748242 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748494 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgnj2\" (UniqueName: \"kubernetes.io/projected/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kube-api-access-hgnj2\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748642 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748593 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748642 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748604 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748775 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748673 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.748775 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.748697 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.750811 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.750787 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.756758 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.756737 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgnj2\" (UniqueName: \"kubernetes.io/projected/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kube-api-access-hgnj2\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.808555 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.808516 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:38.935421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:38.935392 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf"] Apr 24 21:44:38.938095 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:44:38.938065 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaefbeedc_4e73_4dd4_b28f_9c545cc9b622.slice/crio-fb7573f9f934ef666824fd634e2840e09de25c04d2bf674f51eaa550e1965b12 WatchSource:0}: Error finding container fb7573f9f934ef666824fd634e2840e09de25c04d2bf674f51eaa550e1965b12: Status 404 returned error can't find the container with id fb7573f9f934ef666824fd634e2840e09de25c04d2bf674f51eaa550e1965b12 Apr 24 21:44:39.279175 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:39.279136 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" event={"ID":"aefbeedc-4e73-4dd4-b28f-9c545cc9b622","Type":"ContainerStarted","Data":"50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643"} Apr 24 
21:44:39.279175 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:39.279174 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" event={"ID":"aefbeedc-4e73-4dd4-b28f-9c545cc9b622","Type":"ContainerStarted","Data":"fb7573f9f934ef666824fd634e2840e09de25c04d2bf674f51eaa550e1965b12"} Apr 24 21:44:40.283112 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:40.283072 2568 generic.go:358] "Generic (PLEG): container finished" podID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerID="50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643" exitCode=0 Apr 24 21:44:40.283481 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:40.283135 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" event={"ID":"aefbeedc-4e73-4dd4-b28f-9c545cc9b622","Type":"ContainerDied","Data":"50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643"} Apr 24 21:44:41.289122 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:41.289088 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" event={"ID":"aefbeedc-4e73-4dd4-b28f-9c545cc9b622","Type":"ContainerStarted","Data":"70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072"} Apr 24 21:44:41.289122 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:41.289126 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" event={"ID":"aefbeedc-4e73-4dd4-b28f-9c545cc9b622","Type":"ContainerStarted","Data":"04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476"} Apr 24 21:44:41.289534 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:41.289230 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:41.316171 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:41.316120 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" podStartSLOduration=3.316101788 podStartE2EDuration="3.316101788s" podCreationTimestamp="2026-04-24 21:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:44:41.31439393 +0000 UTC m=+1005.917462888" watchObservedRunningTime="2026-04-24 21:44:41.316101788 +0000 UTC m=+1005.919170571" Apr 24 21:44:48.808955 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:48.808866 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:48.808955 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:48.808904 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:48.811469 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:48.811445 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:44:49.320117 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:44:49.320083 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:45:10.323861 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:10.323834 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:45:11.917323 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:45:11.917280 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf"] Apr 24 21:45:11.917833 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:11.917683 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="main" containerID="cri-o://04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476" gracePeriod=30 Apr 24 21:45:11.917900 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:11.917799 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="tokenizer" containerID="cri-o://70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072" gracePeriod=30 Apr 24 21:45:12.405046 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:12.405010 2568 generic.go:358] "Generic (PLEG): container finished" podID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerID="04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476" exitCode=0 Apr 24 21:45:12.405218 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:12.405089 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" event={"ID":"aefbeedc-4e73-4dd4-b28f-9c545cc9b622","Type":"ContainerDied","Data":"04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476"} Apr 24 21:45:13.060895 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.060875 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:45:13.149793 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.149721 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgnj2\" (UniqueName: \"kubernetes.io/projected/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kube-api-access-hgnj2\") pod \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " Apr 24 21:45:13.149793 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.149765 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tls-certs\") pod \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " Apr 24 21:45:13.149978 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.149813 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-cache\") pod \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " Apr 24 21:45:13.149978 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.149855 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-uds\") pod \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " Apr 24 21:45:13.149978 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.149882 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kserve-provision-location\") pod \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") 
" Apr 24 21:45:13.149978 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.149947 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-tmp\") pod \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\" (UID: \"aefbeedc-4e73-4dd4-b28f-9c545cc9b622\") " Apr 24 21:45:13.150164 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.150077 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "aefbeedc-4e73-4dd4-b28f-9c545cc9b622" (UID: "aefbeedc-4e73-4dd4-b28f-9c545cc9b622"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:13.150213 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.150092 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "aefbeedc-4e73-4dd4-b28f-9c545cc9b622" (UID: "aefbeedc-4e73-4dd4-b28f-9c545cc9b622"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:13.150327 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.150307 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.150364 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.150315 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "aefbeedc-4e73-4dd4-b28f-9c545cc9b622" (UID: "aefbeedc-4e73-4dd4-b28f-9c545cc9b622"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:13.150364 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.150334 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.150643 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.150623 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aefbeedc-4e73-4dd4-b28f-9c545cc9b622" (UID: "aefbeedc-4e73-4dd4-b28f-9c545cc9b622"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:13.151838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.151817 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kube-api-access-hgnj2" (OuterVolumeSpecName: "kube-api-access-hgnj2") pod "aefbeedc-4e73-4dd4-b28f-9c545cc9b622" (UID: "aefbeedc-4e73-4dd4-b28f-9c545cc9b622"). InnerVolumeSpecName "kube-api-access-hgnj2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:13.151838 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.151827 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "aefbeedc-4e73-4dd4-b28f-9c545cc9b622" (UID: "aefbeedc-4e73-4dd4-b28f-9c545cc9b622"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:13.251397 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.251373 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.251397 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.251393 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hgnj2\" (UniqueName: \"kubernetes.io/projected/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kube-api-access-hgnj2\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.251523 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.251404 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.251523 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.251414 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aefbeedc-4e73-4dd4-b28f-9c545cc9b622-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:45:13.409817 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.409731 2568 generic.go:358] "Generic (PLEG): container finished" podID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerID="70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072" exitCode=0 Apr 24 21:45:13.409817 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.409781 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" event={"ID":"aefbeedc-4e73-4dd4-b28f-9c545cc9b622","Type":"ContainerDied","Data":"70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072"} Apr 24 21:45:13.409817 
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.409811 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" Apr 24 21:45:13.410041 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.409828 2568 scope.go:117] "RemoveContainer" containerID="70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072" Apr 24 21:45:13.410041 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.409816 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf" event={"ID":"aefbeedc-4e73-4dd4-b28f-9c545cc9b622","Type":"ContainerDied","Data":"fb7573f9f934ef666824fd634e2840e09de25c04d2bf674f51eaa550e1965b12"} Apr 24 21:45:13.418280 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.418263 2568 scope.go:117] "RemoveContainer" containerID="04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476" Apr 24 21:45:13.425347 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.425330 2568 scope.go:117] "RemoveContainer" containerID="50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643" Apr 24 21:45:13.435579 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.433478 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf"] Apr 24 21:45:13.435668 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.435649 2568 scope.go:117] "RemoveContainer" containerID="70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072" Apr 24 21:45:13.436236 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:45:13.436180 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072\": container with ID starting with 70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072 
not found: ID does not exist" containerID="70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072" Apr 24 21:45:13.436236 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.436214 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072"} err="failed to get container status \"70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072\": rpc error: code = NotFound desc = could not find container \"70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072\": container with ID starting with 70756c021887da4c12269a5d3d84d41eb3ccd351f75b7fe720e91b4836d18072 not found: ID does not exist" Apr 24 21:45:13.436236 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.436239 2568 scope.go:117] "RemoveContainer" containerID="04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476" Apr 24 21:45:13.436825 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:45:13.436804 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476\": container with ID starting with 04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476 not found: ID does not exist" containerID="04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476" Apr 24 21:45:13.436918 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.436834 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476"} err="failed to get container status \"04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476\": rpc error: code = NotFound desc = could not find container \"04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476\": container with ID starting with 04abbb57ee2ca8ea7b5744ea10a4b2ea7e0416d505f614fc0fab8da2bde8e476 not found: ID 
does not exist" Apr 24 21:45:13.436918 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.436856 2568 scope.go:117] "RemoveContainer" containerID="50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643" Apr 24 21:45:13.437165 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:45:13.437145 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643\": container with ID starting with 50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643 not found: ID does not exist" containerID="50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643" Apr 24 21:45:13.437237 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.437182 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643"} err="failed to get container status \"50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643\": rpc error: code = NotFound desc = could not find container \"50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643\": container with ID starting with 50acbe44fe72e75367818b7af3c8e16171f0ed3fef81f4e5e3fcf6d306391643 not found: ID does not exist" Apr 24 21:45:13.437966 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.437948 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc95ndvf"] Apr 24 21:45:13.917925 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:13.917890 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" path="/var/lib/kubelet/pods/aefbeedc-4e73-4dd4-b28f-9c545cc9b622/volumes" Apr 24 21:45:41.002075 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002042 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9"] Apr 24 21:45:41.002520 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002402 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="storage-initializer" Apr 24 21:45:41.002520 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002413 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="storage-initializer" Apr 24 21:45:41.002520 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002421 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="main" Apr 24 21:45:41.002520 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002426 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="main" Apr 24 21:45:41.002520 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002443 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="tokenizer" Apr 24 21:45:41.002520 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002448 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="tokenizer" Apr 24 21:45:41.002520 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002521 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="main" Apr 24 21:45:41.002779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.002533 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="aefbeedc-4e73-4dd4-b28f-9c545cc9b622" containerName="tokenizer" Apr 24 21:45:41.005699 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.005684 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.009324 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.009306 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 24 21:45:41.016022 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.015996 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9"] Apr 24 21:45:41.083455 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.083427 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp742\" (UniqueName: \"kubernetes.io/projected/9f9af30f-f919-4c71-844b-b15028cb429c-kube-api-access-tp742\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.083623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.083471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-tmp-dir\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.083623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.083518 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-home\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.083623 
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.083535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9af30f-f919-4c71-844b-b15028cb429c-tls-certs\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.083623 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.083560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-dshm\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.083849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.083661 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-model-cache\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.083849 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.083758 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.184782 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.184748 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tp742\" (UniqueName: \"kubernetes.io/projected/9f9af30f-f919-4c71-844b-b15028cb429c-kube-api-access-tp742\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.184946 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.184798 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-tmp-dir\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.184946 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.184857 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-home\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.184946 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.184881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9af30f-f919-4c71-844b-b15028cb429c-tls-certs\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.185123 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.184942 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-dshm\") pod 
\"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.185123 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.185024 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-model-cache\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.185123 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.185071 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.185273 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.185240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-home\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.185344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.185318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-tmp-dir\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 
21:45:41.185427 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.185408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.185483 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.185416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-model-cache\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.187176 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.187157 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-dshm\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.187327 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.187310 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9af30f-f919-4c71-844b-b15028cb429c-tls-certs\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.193233 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.193207 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp742\" (UniqueName: 
\"kubernetes.io/projected/9f9af30f-f919-4c71-844b-b15028cb429c-kube-api-access-tp742\") pod \"precise-prefix-cache-test-kserve-659d8476f4-f8gx9\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.317845 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.317783 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:45:41.375585 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.375548 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2"] Apr 24 21:45:41.381995 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.381967 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.384835 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.384809 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-7gz4b\"" Apr 24 21:45:41.391649 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.391617 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2"] Apr 24 21:45:41.457174 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.457130 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9"] Apr 24 21:45:41.460152 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:45:41.460124 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f9af30f_f919_4c71_844b_b15028cb429c.slice/crio-57f414c735829e2db45375ecbd8afbea13429a85c3dc989303332fde8b4389c0 WatchSource:0}: 
Error finding container 57f414c735829e2db45375ecbd8afbea13429a85c3dc989303332fde8b4389c0: Status 404 returned error can't find the container with id 57f414c735829e2db45375ecbd8afbea13429a85c3dc989303332fde8b4389c0 Apr 24 21:45:41.462234 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.462217 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:45:41.488100 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.488079 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.488182 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.488108 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkggk\" (UniqueName: \"kubernetes.io/projected/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kube-api-access-nkggk\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.488182 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.488154 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.488182 ip-10-0-134-232 kubenswrapper[2568]: 
I0424 21:45:41.488171 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.488285 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.488192 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.488285 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.488235 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.503463 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.503436 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" event={"ID":"9f9af30f-f919-4c71-844b-b15028cb429c","Type":"ContainerStarted","Data":"57f414c735829e2db45375ecbd8afbea13429a85c3dc989303332fde8b4389c0"} Apr 24 21:45:41.589740 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.589644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.589740 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.589679 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkggk\" (UniqueName: \"kubernetes.io/projected/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kube-api-access-nkggk\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.590006 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.589759 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.590006 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.589776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.590006 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.589807 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.590006 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.589831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.590223 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.590065 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.590223 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.590089 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.590297 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.590220 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.590297 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.590232 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.592053 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.592037 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.598779 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.598759 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkggk\" (UniqueName: \"kubernetes.io/projected/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kube-api-access-nkggk\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.698943 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.698911 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:41.823059 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:41.823033 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2"] Apr 24 21:45:41.825069 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:45:41.825042 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b341c90_1f13_48ed_a48e_65ed0ce1b187.slice/crio-8719bef761a1bf8fb30e3033ade7c3a06d49cc6f649c9c3ca2f19d03c2c6ceec WatchSource:0}: Error finding container 8719bef761a1bf8fb30e3033ade7c3a06d49cc6f649c9c3ca2f19d03c2c6ceec: Status 404 returned error can't find the container with id 8719bef761a1bf8fb30e3033ade7c3a06d49cc6f649c9c3ca2f19d03c2c6ceec Apr 24 21:45:42.508621 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:42.508579 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" event={"ID":"9f9af30f-f919-4c71-844b-b15028cb429c","Type":"ContainerStarted","Data":"22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01"} Apr 24 21:45:42.510101 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:42.510075 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" event={"ID":"8b341c90-1f13-48ed-a48e-65ed0ce1b187","Type":"ContainerStarted","Data":"5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d"} Apr 24 21:45:42.510212 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:42.510104 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" 
event={"ID":"8b341c90-1f13-48ed-a48e-65ed0ce1b187","Type":"ContainerStarted","Data":"8719bef761a1bf8fb30e3033ade7c3a06d49cc6f649c9c3ca2f19d03c2c6ceec"} Apr 24 21:45:43.515052 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:43.515010 2568 generic.go:358] "Generic (PLEG): container finished" podID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerID="5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d" exitCode=0 Apr 24 21:45:43.515436 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:43.515094 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" event={"ID":"8b341c90-1f13-48ed-a48e-65ed0ce1b187","Type":"ContainerDied","Data":"5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d"} Apr 24 21:45:44.521192 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:44.521152 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" event={"ID":"8b341c90-1f13-48ed-a48e-65ed0ce1b187","Type":"ContainerStarted","Data":"ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15"} Apr 24 21:45:44.521646 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:44.521200 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" event={"ID":"8b341c90-1f13-48ed-a48e-65ed0ce1b187","Type":"ContainerStarted","Data":"c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5"} Apr 24 21:45:44.521646 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:44.521245 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:44.543390 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:44.543342 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" podStartSLOduration=3.54332886 podStartE2EDuration="3.54332886s" podCreationTimestamp="2026-04-24 21:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:44.542262244 +0000 UTC m=+1069.145331026" watchObservedRunningTime="2026-04-24 21:45:44.54332886 +0000 UTC m=+1069.146397641" Apr 24 21:45:51.699548 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:51.699513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:51.699548 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:51.699548 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:51.700625 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:45:51.700604 2568 logging.go:55] [core] [Channel #209 SubChannel #210]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused" Apr 24 21:45:51.701947 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:51.701928 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:52.554743 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:52.554699 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:45:52.700431 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:45:52.700390 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded" Apr 24 21:46:01.700479 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:46:01.700443 2568 logging.go:55] [core] [Channel #217 SubChannel #218]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused" Apr 24 21:46:02.700407 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:02.700360 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded" Apr 24 21:46:13.559069 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:13.559037 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:46:18.729181 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:18.729145 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"] Apr 24 21:46:18.729624 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:18.729469 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="main" containerID="cri-o://798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0" gracePeriod=30 Apr 24 21:46:18.729624 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:18.729519 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="tokenizer" containerID="cri-o://0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90" gracePeriod=30 Apr 24 21:46:19.651980 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.651943 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerID="798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0" exitCode=0 Apr 24 21:46:19.652215 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.652005 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" event={"ID":"dff58802-382a-4e33-ae62-fb58c287e7ad","Type":"ContainerDied","Data":"798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0"} Apr 24 21:46:19.871571 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.871548 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" Apr 24 21:46:19.931932 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.931871 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-tmp\") pod \"dff58802-382a-4e33-ae62-fb58c287e7ad\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " Apr 24 21:46:19.932052 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.931939 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-cache\") pod \"dff58802-382a-4e33-ae62-fb58c287e7ad\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " Apr 24 21:46:19.932052 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.931964 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-uds\") pod \"dff58802-382a-4e33-ae62-fb58c287e7ad\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " Apr 24 21:46:19.932052 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.931987 2568 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qbhnk\" (UniqueName: \"kubernetes.io/projected/dff58802-382a-4e33-ae62-fb58c287e7ad-kube-api-access-qbhnk\") pod \"dff58802-382a-4e33-ae62-fb58c287e7ad\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " Apr 24 21:46:19.932052 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932043 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dff58802-382a-4e33-ae62-fb58c287e7ad-tls-certs\") pod \"dff58802-382a-4e33-ae62-fb58c287e7ad\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " Apr 24 21:46:19.932238 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932084 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-kserve-provision-location\") pod \"dff58802-382a-4e33-ae62-fb58c287e7ad\" (UID: \"dff58802-382a-4e33-ae62-fb58c287e7ad\") " Apr 24 21:46:19.932238 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932204 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "dff58802-382a-4e33-ae62-fb58c287e7ad" (UID: "dff58802-382a-4e33-ae62-fb58c287e7ad"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:19.932238 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932225 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "dff58802-382a-4e33-ae62-fb58c287e7ad" (UID: "dff58802-382a-4e33-ae62-fb58c287e7ad"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:19.932390 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932238 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "dff58802-382a-4e33-ae62-fb58c287e7ad" (UID: "dff58802-382a-4e33-ae62-fb58c287e7ad"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:19.932390 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932387 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:46:19.932485 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932399 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:46:19.932485 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932408 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:46:19.932885 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.932857 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dff58802-382a-4e33-ae62-fb58c287e7ad" (UID: "dff58802-382a-4e33-ae62-fb58c287e7ad"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:19.934232 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.934206 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dff58802-382a-4e33-ae62-fb58c287e7ad-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dff58802-382a-4e33-ae62-fb58c287e7ad" (UID: "dff58802-382a-4e33-ae62-fb58c287e7ad"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:19.934232 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:19.934208 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dff58802-382a-4e33-ae62-fb58c287e7ad-kube-api-access-qbhnk" (OuterVolumeSpecName: "kube-api-access-qbhnk") pod "dff58802-382a-4e33-ae62-fb58c287e7ad" (UID: "dff58802-382a-4e33-ae62-fb58c287e7ad"). InnerVolumeSpecName "kube-api-access-qbhnk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:46:20.032950 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.032918 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbhnk\" (UniqueName: \"kubernetes.io/projected/dff58802-382a-4e33-ae62-fb58c287e7ad-kube-api-access-qbhnk\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:46:20.032950 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.032947 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dff58802-382a-4e33-ae62-fb58c287e7ad-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:46:20.032950 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.032956 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dff58802-382a-4e33-ae62-fb58c287e7ad-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:46:20.657693 ip-10-0-134-232 kubenswrapper[2568]: 
I0424 21:46:20.657660 2568 generic.go:358] "Generic (PLEG): container finished" podID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerID="0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90" exitCode=0 Apr 24 21:46:20.657874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.657696 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" event={"ID":"dff58802-382a-4e33-ae62-fb58c287e7ad","Type":"ContainerDied","Data":"0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90"} Apr 24 21:46:20.657874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.657737 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" event={"ID":"dff58802-382a-4e33-ae62-fb58c287e7ad","Type":"ContainerDied","Data":"db1cb12d64d78c8797d4f820fbc68ac7dec07ab917f7e92addde123eba596c4e"} Apr 24 21:46:20.657874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.657745 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw" Apr 24 21:46:20.657874 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.657755 2568 scope.go:117] "RemoveContainer" containerID="0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90" Apr 24 21:46:20.666464 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.666448 2568 scope.go:117] "RemoveContainer" containerID="798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0" Apr 24 21:46:20.673805 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.673786 2568 scope.go:117] "RemoveContainer" containerID="5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28" Apr 24 21:46:20.681195 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.681173 2568 scope.go:117] "RemoveContainer" containerID="0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90" Apr 24 21:46:20.681304 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.681236 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"] Apr 24 21:46:20.681446 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:46:20.681429 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90\": container with ID starting with 0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90 not found: ID does not exist" containerID="0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90" Apr 24 21:46:20.681489 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.681455 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90"} err="failed to get container status \"0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90\": rpc error: code = NotFound 
desc = could not find container \"0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90\": container with ID starting with 0d1cd03e481e138c0b637106126953f60c887efc7b8d3a3b4dc702b5afaccd90 not found: ID does not exist" Apr 24 21:46:20.681489 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.681472 2568 scope.go:117] "RemoveContainer" containerID="798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0" Apr 24 21:46:20.681807 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:46:20.681691 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0\": container with ID starting with 798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0 not found: ID does not exist" containerID="798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0" Apr 24 21:46:20.681887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.681809 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0"} err="failed to get container status \"798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0\": rpc error: code = NotFound desc = could not find container \"798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0\": container with ID starting with 798ea94fd793e7c7e75e6c34de1a1865eaa12b1da8a973e5026e0c3ada9fbad0 not found: ID does not exist" Apr 24 21:46:20.681887 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.681835 2568 scope.go:117] "RemoveContainer" containerID="5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28" Apr 24 21:46:20.682080 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:46:20.682064 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28\": 
container with ID starting with 5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28 not found: ID does not exist" containerID="5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28" Apr 24 21:46:20.682123 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.682085 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28"} err="failed to get container status \"5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28\": rpc error: code = NotFound desc = could not find container \"5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28\": container with ID starting with 5ba5588ba772c96ffe1a5027e7d1ac065008a58712de988945c634e412640b28 not found: ID does not exist" Apr 24 21:46:20.687930 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:20.687908 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schex6ctw"] Apr 24 21:46:21.917915 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:21.917876 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" path="/var/lib/kubelet/pods/dff58802-382a-4e33-ae62-fb58c287e7ad/volumes" Apr 24 21:46:39.730022 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:39.729989 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f9af30f-f919-4c71-844b-b15028cb429c" containerID="22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01" exitCode=0 Apr 24 21:46:39.730335 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:39.730048 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" event={"ID":"9f9af30f-f919-4c71-844b-b15028cb429c","Type":"ContainerDied","Data":"22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01"} Apr 24 21:46:41.738812 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:46:41.738776 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" event={"ID":"9f9af30f-f919-4c71-844b-b15028cb429c","Type":"ContainerStarted","Data":"c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de"} Apr 24 21:46:41.763011 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:41.762967 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" podStartSLOduration=60.738967984 podStartE2EDuration="1m1.762953526s" podCreationTimestamp="2026-04-24 21:45:40 +0000 UTC" firstStartedPulling="2026-04-24 21:46:39.731123905 +0000 UTC m=+1124.334192664" lastFinishedPulling="2026-04-24 21:46:40.755109444 +0000 UTC m=+1125.358178206" observedRunningTime="2026-04-24 21:46:41.761461795 +0000 UTC m=+1126.364530575" watchObservedRunningTime="2026-04-24 21:46:41.762953526 +0000 UTC m=+1126.366022307" Apr 24 21:46:51.318169 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:51.318129 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:46:51.318169 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:51.318178 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:46:51.330579 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:51.330558 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:46:51.788885 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:46:51.788850 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:47:55.879638 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:47:55.879608 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:47:55.881890 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:47:55.881869 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:48:03.436020 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.435980 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2"] Apr 24 21:48:03.436489 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.436396 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="main" containerID="cri-o://c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5" gracePeriod=30 Apr 24 21:48:03.436555 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.436498 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="tokenizer" containerID="cri-o://ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15" gracePeriod=30 Apr 24 21:48:03.448283 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.448244 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9"] Apr 24 21:48:03.448567 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.448541 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" podUID="9f9af30f-f919-4c71-844b-b15028cb429c" 
containerName="main" containerID="cri-o://c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de" gracePeriod=30 Apr 24 21:48:03.558183 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:48:03.558157 2568 logging.go:55] [core] [Channel #300 SubChannel #301]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused" Apr 24 21:48:03.703835 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.703811 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:48:03.833648 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.833614 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp742\" (UniqueName: \"kubernetes.io/projected/9f9af30f-f919-4c71-844b-b15028cb429c-kube-api-access-tp742\") pod \"9f9af30f-f919-4c71-844b-b15028cb429c\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " Apr 24 21:48:03.833845 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.833752 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-home\") pod \"9f9af30f-f919-4c71-844b-b15028cb429c\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " Apr 24 21:48:03.833845 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.833782 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9af30f-f919-4c71-844b-b15028cb429c-tls-certs\") pod \"9f9af30f-f919-4c71-844b-b15028cb429c\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " Apr 24 21:48:03.833845 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.833804 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-model-cache\") pod \"9f9af30f-f919-4c71-844b-b15028cb429c\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " Apr 24 21:48:03.833845 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.833834 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-kserve-provision-location\") pod \"9f9af30f-f919-4c71-844b-b15028cb429c\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " Apr 24 21:48:03.834062 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.834020 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-model-cache" (OuterVolumeSpecName: "model-cache") pod "9f9af30f-f919-4c71-844b-b15028cb429c" (UID: "9f9af30f-f919-4c71-844b-b15028cb429c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:03.834062 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.834031 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-home" (OuterVolumeSpecName: "home") pod "9f9af30f-f919-4c71-844b-b15028cb429c" (UID: "9f9af30f-f919-4c71-844b-b15028cb429c"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:03.835118 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.835097 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-tmp-dir\") pod \"9f9af30f-f919-4c71-844b-b15028cb429c\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " Apr 24 21:48:03.835573 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.835451 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "9f9af30f-f919-4c71-844b-b15028cb429c" (UID: "9f9af30f-f919-4c71-844b-b15028cb429c"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:03.835777 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.835761 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-dshm\") pod \"9f9af30f-f919-4c71-844b-b15028cb429c\" (UID: \"9f9af30f-f919-4c71-844b-b15028cb429c\") " Apr 24 21:48:03.836125 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.836101 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9af30f-f919-4c71-844b-b15028cb429c-kube-api-access-tp742" (OuterVolumeSpecName: "kube-api-access-tp742") pod "9f9af30f-f919-4c71-844b-b15028cb429c" (UID: "9f9af30f-f919-4c71-844b-b15028cb429c"). InnerVolumeSpecName "kube-api-access-tp742". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:48:03.836313 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.836295 2568 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-tmp-dir\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:03.836394 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.836348 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tp742\" (UniqueName: \"kubernetes.io/projected/9f9af30f-f919-4c71-844b-b15028cb429c-kube-api-access-tp742\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:03.836394 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.836366 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-home\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:03.836394 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.836380 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-model-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:03.837563 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.837535 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9af30f-f919-4c71-844b-b15028cb429c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9f9af30f-f919-4c71-844b-b15028cb429c" (UID: "9f9af30f-f919-4c71-844b-b15028cb429c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:03.838642 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.838619 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-dshm" (OuterVolumeSpecName: "dshm") pod "9f9af30f-f919-4c71-844b-b15028cb429c" (UID: "9f9af30f-f919-4c71-844b-b15028cb429c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:03.890354 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.890306 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f9af30f-f919-4c71-844b-b15028cb429c" (UID: "9f9af30f-f919-4c71-844b-b15028cb429c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:03.937328 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.937249 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-dshm\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:03.937328 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.937303 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f9af30f-f919-4c71-844b-b15028cb429c-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:03.937328 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:03.937319 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f9af30f-f919-4c71-844b-b15028cb429c-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:04.026359 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.026321 2568 
generic.go:358] "Generic (PLEG): container finished" podID="9f9af30f-f919-4c71-844b-b15028cb429c" containerID="c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de" exitCode=0 Apr 24 21:48:04.026540 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.026404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" event={"ID":"9f9af30f-f919-4c71-844b-b15028cb429c","Type":"ContainerDied","Data":"c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de"} Apr 24 21:48:04.026540 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.026425 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" Apr 24 21:48:04.026540 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.026452 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9" event={"ID":"9f9af30f-f919-4c71-844b-b15028cb429c","Type":"ContainerDied","Data":"57f414c735829e2db45375ecbd8afbea13429a85c3dc989303332fde8b4389c0"} Apr 24 21:48:04.026540 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.026474 2568 scope.go:117] "RemoveContainer" containerID="c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de" Apr 24 21:48:04.028746 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.028592 2568 generic.go:358] "Generic (PLEG): container finished" podID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerID="c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5" exitCode=0 Apr 24 21:48:04.028746 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.028669 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" event={"ID":"8b341c90-1f13-48ed-a48e-65ed0ce1b187","Type":"ContainerDied","Data":"c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5"} 
Apr 24 21:48:04.035684 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.035642 2568 scope.go:117] "RemoveContainer" containerID="22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01" Apr 24 21:48:04.045606 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.045582 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9"] Apr 24 21:48:04.049815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.049793 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-659d8476f4-f8gx9"] Apr 24 21:48:04.098429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.098409 2568 scope.go:117] "RemoveContainer" containerID="c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de" Apr 24 21:48:04.098719 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:48:04.098688 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de\": container with ID starting with c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de not found: ID does not exist" containerID="c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de" Apr 24 21:48:04.098784 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.098737 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de"} err="failed to get container status \"c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de\": rpc error: code = NotFound desc = could not find container \"c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de\": container with ID starting with c76af957f9fcde30421249d2434eb1f92f1409ae09bb64a323a8b60f9b17f9de not found: ID does not exist" Apr 24 21:48:04.098784 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.098756 
2568 scope.go:117] "RemoveContainer" containerID="22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01" Apr 24 21:48:04.099033 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:48:04.099018 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01\": container with ID starting with 22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01 not found: ID does not exist" containerID="22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01" Apr 24 21:48:04.099075 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.099035 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01"} err="failed to get container status \"22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01\": rpc error: code = NotFound desc = could not find container \"22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01\": container with ID starting with 22e00b90d4320b3bcc60b4c47fa273a13880752fdfec91385d848fcca9f4da01 not found: ID does not exist" Apr 24 21:48:04.558839 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.558796 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded" Apr 24 21:48:04.891579 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:04.891556 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:48:05.034902 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.034864 2568 generic.go:358] "Generic (PLEG): container finished" podID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerID="ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15" exitCode=0 Apr 24 21:48:05.035081 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.034947 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" Apr 24 21:48:05.035081 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.034934 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" event={"ID":"8b341c90-1f13-48ed-a48e-65ed0ce1b187","Type":"ContainerDied","Data":"ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15"} Apr 24 21:48:05.035081 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.034988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2" event={"ID":"8b341c90-1f13-48ed-a48e-65ed0ce1b187","Type":"ContainerDied","Data":"8719bef761a1bf8fb30e3033ade7c3a06d49cc6f649c9c3ca2f19d03c2c6ceec"} Apr 24 21:48:05.035081 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.035008 2568 scope.go:117] "RemoveContainer" containerID="ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15" Apr 24 21:48:05.043367 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.043339 2568 scope.go:117] "RemoveContainer" containerID="c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5" Apr 24 21:48:05.048498 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.048475 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-uds\") pod \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " Apr 24 21:48:05.048782 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.048578 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kserve-provision-location\") pod \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " Apr 24 21:48:05.048782 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.048608 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-tmp\") pod \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " Apr 24 21:48:05.048782 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.048636 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tls-certs\") pod \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " Apr 24 21:48:05.048782 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.048672 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-cache\") pod \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " Apr 24 21:48:05.048782 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.048775 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8b341c90-1f13-48ed-a48e-65ed0ce1b187" (UID: 
"8b341c90-1f13-48ed-a48e-65ed0ce1b187"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:05.049104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.048809 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkggk\" (UniqueName: \"kubernetes.io/projected/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kube-api-access-nkggk\") pod \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\" (UID: \"8b341c90-1f13-48ed-a48e-65ed0ce1b187\") " Apr 24 21:48:05.049170 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.049152 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8b341c90-1f13-48ed-a48e-65ed0ce1b187" (UID: "8b341c90-1f13-48ed-a48e-65ed0ce1b187"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:05.049781 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.049336 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:05.049781 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.049361 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:05.049781 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.049456 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8b341c90-1f13-48ed-a48e-65ed0ce1b187" (UID: "8b341c90-1f13-48ed-a48e-65ed0ce1b187"). 
InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:05.049781 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.049740 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8b341c90-1f13-48ed-a48e-65ed0ce1b187" (UID: "8b341c90-1f13-48ed-a48e-65ed0ce1b187"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:05.051056 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.050976 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8b341c90-1f13-48ed-a48e-65ed0ce1b187" (UID: "8b341c90-1f13-48ed-a48e-65ed0ce1b187"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:05.051134 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.051080 2568 scope.go:117] "RemoveContainer" containerID="5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d" Apr 24 21:48:05.051392 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.051370 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kube-api-access-nkggk" (OuterVolumeSpecName: "kube-api-access-nkggk") pod "8b341c90-1f13-48ed-a48e-65ed0ce1b187" (UID: "8b341c90-1f13-48ed-a48e-65ed0ce1b187"). InnerVolumeSpecName "kube-api-access-nkggk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:48:05.061237 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.061223 2568 scope.go:117] "RemoveContainer" containerID="ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15" Apr 24 21:48:05.061467 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:48:05.061450 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15\": container with ID starting with ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15 not found: ID does not exist" containerID="ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15" Apr 24 21:48:05.061513 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.061473 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15"} err="failed to get container status \"ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15\": rpc error: code = NotFound desc = could not find container \"ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15\": container with ID starting with ed7ae40ff89b28673e78786b98f6c28f0a59a1a6769fcbd200b771cd8a22fb15 not found: ID does not exist" Apr 24 21:48:05.061513 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.061491 2568 scope.go:117] "RemoveContainer" containerID="c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5" Apr 24 21:48:05.061724 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:48:05.061694 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5\": container with ID starting with c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5 not found: ID does not exist" 
containerID="c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5" Apr 24 21:48:05.061767 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.061735 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5"} err="failed to get container status \"c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5\": rpc error: code = NotFound desc = could not find container \"c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5\": container with ID starting with c76963d22646d292eec26a1e5d370634e9ba8222a9c0bc504a6e736d527073f5 not found: ID does not exist" Apr 24 21:48:05.061767 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.061752 2568 scope.go:117] "RemoveContainer" containerID="5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d" Apr 24 21:48:05.061974 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:48:05.061957 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d\": container with ID starting with 5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d not found: ID does not exist" containerID="5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d" Apr 24 21:48:05.062013 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.061979 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d"} err="failed to get container status \"5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d\": rpc error: code = NotFound desc = could not find container \"5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d\": container with ID starting with 5c464f90a3cebdbe924f2f5db439b5b62ea1a25777acf11f794c64d21c027f4d not found: ID does not exist" Apr 24 
21:48:05.150629 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.150559 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:05.150629 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.150587 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:05.150629 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.150599 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8b341c90-1f13-48ed-a48e-65ed0ce1b187-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:05.150629 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.150610 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkggk\" (UniqueName: \"kubernetes.io/projected/8b341c90-1f13-48ed-a48e-65ed0ce1b187-kube-api-access-nkggk\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:48:05.357561 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.357532 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2"] Apr 24 21:48:05.362134 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.362109 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6fc856f85q2r2"] Apr 24 21:48:05.917991 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.917960 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" path="/var/lib/kubelet/pods/8b341c90-1f13-48ed-a48e-65ed0ce1b187/volumes" Apr 24 
21:48:05.918432 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:05.918417 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9af30f-f919-4c71-844b-b15028cb429c" path="/var/lib/kubelet/pods/9f9af30f-f919-4c71-844b-b15028cb429c/volumes" Apr 24 21:48:17.691386 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.691354 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8"] Apr 24 21:48:17.691941 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.691919 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="tokenizer" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.691944 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="tokenizer" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.691964 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="main" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.691972 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="main" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.691981 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f9af30f-f919-4c71-844b-b15028cb429c" containerName="main" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.691989 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9af30f-f919-4c71-844b-b15028cb429c" containerName="main" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692003 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" 
containerName="storage-initializer" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692010 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="storage-initializer" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692023 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="storage-initializer" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692030 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="storage-initializer" Apr 24 21:48:17.692037 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692038 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="main" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692045 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="main" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692055 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="tokenizer" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692062 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="tokenizer" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692072 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f9af30f-f919-4c71-844b-b15028cb429c" containerName="storage-initializer" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692080 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9af30f-f919-4c71-844b-b15028cb429c" 
containerName="storage-initializer" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692163 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="tokenizer" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692175 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f9af30f-f919-4c71-844b-b15028cb429c" containerName="main" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692188 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b341c90-1f13-48ed-a48e-65ed0ce1b187" containerName="main" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692196 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="tokenizer" Apr 24 21:48:17.692421 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.692206 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="dff58802-382a-4e33-ae62-fb58c287e7ad" containerName="main" Apr 24 21:48:17.696877 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.696857 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.699433 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.699406 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-7xndx\"" Apr 24 21:48:17.699557 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.699464 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:48:17.699557 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.699544 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:48:17.699982 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.699959 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 21:48:17.700067 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.700005 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:48:17.706020 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.705999 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8"] Apr 24 21:48:17.852863 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.852832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.853015 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.852869 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.853015 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.852967 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.853015 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.852997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.853128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.853068 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzlfl\" (UniqueName: \"kubernetes.io/projected/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kube-api-access-qzlfl\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.853128 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:48:17.853102 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954146 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954070 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954146 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954146 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzlfl\" (UniqueName: \"kubernetes.io/projected/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kube-api-access-qzlfl\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954407 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954161 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954407 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954195 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954407 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954216 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954582 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954582 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954496 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954582 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954568 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.954676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.954585 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.956667 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.956650 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:17.963266 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:17.963243 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzlfl\" (UniqueName: \"kubernetes.io/projected/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kube-api-access-qzlfl\") pod 
\"stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:18.008241 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:18.008216 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:18.133225 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:18.133200 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8"] Apr 24 21:48:18.136537 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:48:18.136501 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d26d31c_278e_4c10_b1a2_11bf3fee1bd3.slice/crio-b302bdfc0b5ca180ea4e31cb21d7c9c0bfd3e8006a278410018464c5adc05733 WatchSource:0}: Error finding container b302bdfc0b5ca180ea4e31cb21d7c9c0bfd3e8006a278410018464c5adc05733: Status 404 returned error can't find the container with id b302bdfc0b5ca180ea4e31cb21d7c9c0bfd3e8006a278410018464c5adc05733 Apr 24 21:48:19.079834 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:19.079801 2568 generic.go:358] "Generic (PLEG): container finished" podID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerID="fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee" exitCode=0 Apr 24 21:48:19.080181 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:19.079846 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" event={"ID":"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3","Type":"ContainerDied","Data":"fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee"} Apr 24 21:48:19.080181 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:19.079877 2568 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" event={"ID":"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3","Type":"ContainerStarted","Data":"b302bdfc0b5ca180ea4e31cb21d7c9c0bfd3e8006a278410018464c5adc05733"} Apr 24 21:48:20.086133 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:20.086103 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" event={"ID":"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3","Type":"ContainerStarted","Data":"54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c"} Apr 24 21:48:20.086133 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:20.086136 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" event={"ID":"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3","Type":"ContainerStarted","Data":"86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d"} Apr 24 21:48:20.086661 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:20.086223 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:20.110291 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:20.110236 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" podStartSLOduration=3.110221011 podStartE2EDuration="3.110221011s" podCreationTimestamp="2026-04-24 21:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:20.10818461 +0000 UTC m=+1224.711253417" watchObservedRunningTime="2026-04-24 21:48:20.110221011 +0000 UTC m=+1224.713289792" Apr 24 21:48:28.008398 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:28.008366 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:28.008803 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:28.008515 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:28.011095 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:28.011075 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:28.114982 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:28.114957 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:48:50.123514 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:48:50.123484 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:49:51.385668 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.385635 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"] Apr 24 21:49:51.390082 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.390064 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.392798 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.392775 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 24 21:49:51.396399 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.396377 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-6gqvf\"" Apr 24 21:49:51.403389 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.403362 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"] Apr 24 21:49:51.458058 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.458016 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.458058 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.458052 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtl8q\" (UniqueName: \"kubernetes.io/projected/6a55fefe-543b-4e58-b85c-9b2743549858-kube-api-access-mtl8q\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.458232 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.458071 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55fefe-543b-4e58-b85c-9b2743549858-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.458232 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.458139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.458232 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.458171 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.458232 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.458210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559296 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559263 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559444 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559444 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559531 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559519 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559572 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559542 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mtl8q\" (UniqueName: \"kubernetes.io/projected/6a55fefe-543b-4e58-b85c-9b2743549858-kube-api-access-mtl8q\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559572 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559563 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55fefe-543b-4e58-b85c-9b2743549858-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559662 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559744 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559686 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559806 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559762 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.559861 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.559827 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.561920 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.561898 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55fefe-543b-4e58-b85c-9b2743549858-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.569497 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.569466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtl8q\" (UniqueName: \"kubernetes.io/projected/6a55fefe-543b-4e58-b85c-9b2743549858-kube-api-access-mtl8q\") pod \"router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.699626 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.699559 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:51.826658 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:51.826631 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"] Apr 24 21:49:51.828511 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:49:51.828484 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a55fefe_543b_4e58_b85c_9b2743549858.slice/crio-806968ec7324bcba6b687a374bd01958aa661ac599c8ded0f3682d4ff22dfd05 WatchSource:0}: Error finding container 806968ec7324bcba6b687a374bd01958aa661ac599c8ded0f3682d4ff22dfd05: Status 404 returned error can't find the container with id 806968ec7324bcba6b687a374bd01958aa661ac599c8ded0f3682d4ff22dfd05 Apr 24 21:49:52.389276 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:52.389236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" event={"ID":"6a55fefe-543b-4e58-b85c-9b2743549858","Type":"ContainerStarted","Data":"dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd"} Apr 24 21:49:52.389276 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:52.389280 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" event={"ID":"6a55fefe-543b-4e58-b85c-9b2743549858","Type":"ContainerStarted","Data":"806968ec7324bcba6b687a374bd01958aa661ac599c8ded0f3682d4ff22dfd05"} Apr 24 21:49:53.393772 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:53.393734 2568 generic.go:358] "Generic (PLEG): container finished" podID="6a55fefe-543b-4e58-b85c-9b2743549858" containerID="dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd" exitCode=0 Apr 24 21:49:53.394137 ip-10-0-134-232 kubenswrapper[2568]: I0424 
21:49:53.393809 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" event={"ID":"6a55fefe-543b-4e58-b85c-9b2743549858","Type":"ContainerDied","Data":"dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd"} Apr 24 21:49:54.399477 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:54.399445 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" event={"ID":"6a55fefe-543b-4e58-b85c-9b2743549858","Type":"ContainerStarted","Data":"bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2"} Apr 24 21:49:54.399477 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:54.399479 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" event={"ID":"6a55fefe-543b-4e58-b85c-9b2743549858","Type":"ContainerStarted","Data":"2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7"} Apr 24 21:49:54.399969 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:54.399582 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:49:54.427515 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:49:54.426787 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" podStartSLOduration=3.426770967 podStartE2EDuration="3.426770967s" podCreationTimestamp="2026-04-24 21:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:54.424477265 +0000 UTC m=+1319.027546071" watchObservedRunningTime="2026-04-24 21:49:54.426770967 +0000 UTC m=+1319.029839749" Apr 24 21:50:01.700778 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:50:01.700738 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:50:01.700778 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:01.700784 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:50:01.703358 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:01.703335 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:50:02.429120 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:02.429088 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" Apr 24 21:50:08.775011 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:08.774977 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8"] Apr 24 21:50:08.775405 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:08.775367 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="main" containerID="cri-o://86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d" gracePeriod=30 Apr 24 21:50:08.775469 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:08.775433 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="tokenizer" containerID="cri-o://54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c" gracePeriod=30 Apr 24 
21:50:09.459231 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:09.459196 2568 generic.go:358] "Generic (PLEG): container finished" podID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerID="86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d" exitCode=0 Apr 24 21:50:09.459408 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:09.459280 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" event={"ID":"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3","Type":"ContainerDied","Data":"86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d"} Apr 24 21:50:09.929832 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:09.929810 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:50:10.018097 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018027 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-tmp\") pod \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " Apr 24 21:50:10.018097 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018067 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-cache\") pod \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " Apr 24 21:50:10.018097 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018091 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzlfl\" (UniqueName: \"kubernetes.io/projected/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kube-api-access-qzlfl\") pod \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\" (UID: 
\"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " Apr 24 21:50:10.018348 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018152 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tls-certs\") pod \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " Apr 24 21:50:10.018348 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018185 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kserve-provision-location\") pod \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " Apr 24 21:50:10.018348 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018219 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-uds\") pod \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\" (UID: \"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3\") " Apr 24 21:50:10.018348 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018279 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" (UID: "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:10.018569 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018368 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" (UID: "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:10.018569 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018470 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:50:10.018569 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018487 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:50:10.018569 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018562 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" (UID: "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:10.018950 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.018929 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" (UID: "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:10.020196 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.020173 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" (UID: "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:10.020260 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.020221 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kube-api-access-qzlfl" (OuterVolumeSpecName: "kube-api-access-qzlfl") pod "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" (UID: "9d26d31c-278e-4c10-b1a2-11bf3fee1bd3"). InnerVolumeSpecName "kube-api-access-qzlfl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:10.119280 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.119244 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzlfl\" (UniqueName: \"kubernetes.io/projected/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kube-api-access-qzlfl\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:50:10.119280 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.119269 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:50:10.119280 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.119279 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:50:10.119280 ip-10-0-134-232 kubenswrapper[2568]: 
I0424 21:50:10.119288 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:50:10.464148 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.464061 2568 generic.go:358] "Generic (PLEG): container finished" podID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerID="54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c" exitCode=0 Apr 24 21:50:10.464148 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.464105 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" event={"ID":"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3","Type":"ContainerDied","Data":"54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c"} Apr 24 21:50:10.464148 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.464139 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" Apr 24 21:50:10.464401 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.464152 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8" event={"ID":"9d26d31c-278e-4c10-b1a2-11bf3fee1bd3","Type":"ContainerDied","Data":"b302bdfc0b5ca180ea4e31cb21d7c9c0bfd3e8006a278410018464c5adc05733"} Apr 24 21:50:10.464401 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.464173 2568 scope.go:117] "RemoveContainer" containerID="54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c" Apr 24 21:50:10.473069 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.473053 2568 scope.go:117] "RemoveContainer" containerID="86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d" Apr 24 21:50:10.480254 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.480238 2568 scope.go:117] "RemoveContainer" containerID="fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee" Apr 24 21:50:10.487443 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.487423 2568 scope.go:117] "RemoveContainer" containerID="54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c" Apr 24 21:50:10.487720 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:50:10.487690 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c\": container with ID starting with 54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c not found: ID does not exist" containerID="54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c" Apr 24 21:50:10.487802 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.487729 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c"} err="failed to get container status \"54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c\": rpc error: code = NotFound desc = could not find container \"54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c\": container with ID starting with 54723e91163dee7dd09c573242eed68be2f4351a59cb545c5e27072805daec8c not found: ID does not exist" Apr 24 21:50:10.487802 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.487747 2568 scope.go:117] "RemoveContainer" containerID="86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d" Apr 24 21:50:10.487931 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.487913 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8"] Apr 24 21:50:10.487979 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:50:10.487949 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d\": container with ID starting with 86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d not found: ID does not exist" containerID="86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d" Apr 24 21:50:10.487979 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.487966 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d"} err="failed to get container status \"86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d\": rpc error: code = NotFound desc = could not find container \"86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d\": container with ID starting with 86b8c0db0b917b57197cee5921a76aab3d60194ba719ec92d05cca194078bd8d not found: ID does not exist" Apr 24 
21:50:10.488062 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.487980 2568 scope.go:117] "RemoveContainer" containerID="fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee"
Apr 24 21:50:10.488204 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:50:10.488191 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee\": container with ID starting with fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee not found: ID does not exist" containerID="fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee"
Apr 24 21:50:10.488241 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.488206 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee"} err="failed to get container status \"fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee\": rpc error: code = NotFound desc = could not find container \"fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee\": container with ID starting with fbb36931daa86099c0adadd0e18ef7260c5aec7c5143ede1b6ee2dc92aa34bee not found: ID does not exist"
Apr 24 21:50:10.492455 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:10.492436 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-zlsr8"]
Apr 24 21:50:11.916457 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:11.916424 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" path="/var/lib/kubelet/pods/9d26d31c-278e-4c10-b1a2-11bf3fee1bd3/volumes"
Apr 24 21:50:23.432806 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:23.432729 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"
Apr 24 21:50:25.595773 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.595739 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"]
Apr 24 21:50:25.596313 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.596288 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="main"
Apr 24 21:50:25.596313 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.596314 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="main"
Apr 24 21:50:25.596519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.596348 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="storage-initializer"
Apr 24 21:50:25.596519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.596367 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="storage-initializer"
Apr 24 21:50:25.596519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.596386 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="tokenizer"
Apr 24 21:50:25.596519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.596395 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="tokenizer"
Apr 24 21:50:25.596519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.596492 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="tokenizer"
Apr 24 21:50:25.596519 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.596504 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d26d31c-278e-4c10-b1a2-11bf3fee1bd3" containerName="main"
Apr 24 21:50:25.599493 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.599470 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.601871 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.601847 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-jxt2k\""
Apr 24 21:50:25.601999 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.601859 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 24 21:50:25.613005 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.612981 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"]
Apr 24 21:50:25.645401 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.645372 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.645543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.645427 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.645543 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.645506 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.645631 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.645541 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvsz\" (UniqueName: \"kubernetes.io/projected/6dd9794c-99c2-475c-a703-764c9e74f02b-kube-api-access-4vvsz\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.645631 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.645570 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd9794c-99c2-475c-a703-764c9e74f02b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.645631 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.645598 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.746943 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.746909 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747090 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.746956 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747090 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.746988 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747090 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.747006 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvsz\" (UniqueName: \"kubernetes.io/projected/6dd9794c-99c2-475c-a703-764c9e74f02b-kube-api-access-4vvsz\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747090 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.747031 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd9794c-99c2-475c-a703-764c9e74f02b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747090 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.747054 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747341 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.747313 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747341 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.747337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747438 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.747381 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.747438 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.747410 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.749629 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.749608 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd9794c-99c2-475c-a703-764c9e74f02b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.756150 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.756120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvsz\" (UniqueName: \"kubernetes.io/projected/6dd9794c-99c2-475c-a703-764c9e74f02b-kube-api-access-4vvsz\") pod \"stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:25.910248 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:25.910180 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:26.038081 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:26.038012 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"]
Apr 24 21:50:26.040676 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:50:26.040645 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd9794c_99c2_475c_a703_764c9e74f02b.slice/crio-29fc60fad2804b3caba5c0f0be9cd8767fff4dd760c8c310ed51b117cb27b584 WatchSource:0}: Error finding container 29fc60fad2804b3caba5c0f0be9cd8767fff4dd760c8c310ed51b117cb27b584: Status 404 returned error can't find the container with id 29fc60fad2804b3caba5c0f0be9cd8767fff4dd760c8c310ed51b117cb27b584
Apr 24 21:50:26.520417 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:26.520379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" event={"ID":"6dd9794c-99c2-475c-a703-764c9e74f02b","Type":"ContainerStarted","Data":"248b1daadaa8c6fb96387cc4f6a554ea232e485415442f00d91a294c930c1e8b"}
Apr 24 21:50:26.520417 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:26.520417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" event={"ID":"6dd9794c-99c2-475c-a703-764c9e74f02b","Type":"ContainerStarted","Data":"29fc60fad2804b3caba5c0f0be9cd8767fff4dd760c8c310ed51b117cb27b584"}
Apr 24 21:50:27.525477 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:27.525442 2568 generic.go:358] "Generic (PLEG): container finished" podID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerID="248b1daadaa8c6fb96387cc4f6a554ea232e485415442f00d91a294c930c1e8b" exitCode=0
Apr 24 21:50:27.525836 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:27.525483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" event={"ID":"6dd9794c-99c2-475c-a703-764c9e74f02b","Type":"ContainerDied","Data":"248b1daadaa8c6fb96387cc4f6a554ea232e485415442f00d91a294c930c1e8b"}
Apr 24 21:50:28.531823 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:28.531786 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" event={"ID":"6dd9794c-99c2-475c-a703-764c9e74f02b","Type":"ContainerStarted","Data":"b8bab41078dc0fdb37da4ff2f8f8918544e64f7f9da5c048414a8f361c105472"}
Apr 24 21:50:28.531823 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:28.531821 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" event={"ID":"6dd9794c-99c2-475c-a703-764c9e74f02b","Type":"ContainerStarted","Data":"15a6d599d053210761955df04f2e6e9e950a4f6fbb7e85f7ece69cbcd5fe2c89"}
Apr 24 21:50:28.532226 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:28.531939 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:28.555152 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:28.555098 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" podStartSLOduration=3.555079075 podStartE2EDuration="3.555079075s" podCreationTimestamp="2026-04-24 21:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:28.551701261 +0000 UTC m=+1353.154770056" watchObservedRunningTime="2026-04-24 21:50:28.555079075 +0000 UTC m=+1353.158147857"
Apr 24 21:50:35.911394 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:35.911349 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:35.911394 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:35.911401 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:35.916724 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:35.916691 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:36.564785 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:36.564749 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:50:57.569495 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:50:57.569463 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"
Apr 24 21:52:10.251735 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:10.251679 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"]
Apr 24 21:52:10.252165 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:10.252046 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="main" containerID="cri-o://2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7" gracePeriod=30
Apr 24 21:52:10.252165 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:10.252107 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="tokenizer" containerID="cri-o://bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2" gracePeriod=30
Apr 24 21:52:10.888026 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:10.887989 2568 generic.go:358] "Generic (PLEG): container finished" podID="6a55fefe-543b-4e58-b85c-9b2743549858" containerID="2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7" exitCode=0
Apr 24 21:52:10.888191 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:10.888056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" event={"ID":"6a55fefe-543b-4e58-b85c-9b2743549858","Type":"ContainerDied","Data":"2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7"}
Apr 24 21:52:11.397333 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.397312 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"
Apr 24 21:52:11.467298 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467223 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtl8q\" (UniqueName: \"kubernetes.io/projected/6a55fefe-543b-4e58-b85c-9b2743549858-kube-api-access-mtl8q\") pod \"6a55fefe-543b-4e58-b85c-9b2743549858\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") "
Apr 24 21:52:11.467298 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467257 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-cache\") pod \"6a55fefe-543b-4e58-b85c-9b2743549858\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") "
Apr 24 21:52:11.467298 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467273 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-uds\") pod \"6a55fefe-543b-4e58-b85c-9b2743549858\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") "
Apr 24 21:52:11.467563 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467314 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55fefe-543b-4e58-b85c-9b2743549858-tls-certs\") pod \"6a55fefe-543b-4e58-b85c-9b2743549858\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") "
Apr 24 21:52:11.467563 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467338 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-tmp\") pod \"6a55fefe-543b-4e58-b85c-9b2743549858\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") "
Apr 24 21:52:11.467563 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467368 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-kserve-provision-location\") pod \"6a55fefe-543b-4e58-b85c-9b2743549858\" (UID: \"6a55fefe-543b-4e58-b85c-9b2743549858\") "
Apr 24 21:52:11.467730 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467570 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6a55fefe-543b-4e58-b85c-9b2743549858" (UID: "6a55fefe-543b-4e58-b85c-9b2743549858"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:11.467730 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467661 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6a55fefe-543b-4e58-b85c-9b2743549858" (UID: "6a55fefe-543b-4e58-b85c-9b2743549858"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:11.467730 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467693 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:52:11.467856 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.467777 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6a55fefe-543b-4e58-b85c-9b2743549858" (UID: "6a55fefe-543b-4e58-b85c-9b2743549858"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:11.468119 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.468098 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a55fefe-543b-4e58-b85c-9b2743549858" (UID: "6a55fefe-543b-4e58-b85c-9b2743549858"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:11.469430 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.469407 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a55fefe-543b-4e58-b85c-9b2743549858-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6a55fefe-543b-4e58-b85c-9b2743549858" (UID: "6a55fefe-543b-4e58-b85c-9b2743549858"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:52:11.469504 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.469436 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a55fefe-543b-4e58-b85c-9b2743549858-kube-api-access-mtl8q" (OuterVolumeSpecName: "kube-api-access-mtl8q") pod "6a55fefe-543b-4e58-b85c-9b2743549858" (UID: "6a55fefe-543b-4e58-b85c-9b2743549858"). InnerVolumeSpecName "kube-api-access-mtl8q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:52:11.568953 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.568911 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtl8q\" (UniqueName: \"kubernetes.io/projected/6a55fefe-543b-4e58-b85c-9b2743549858-kube-api-access-mtl8q\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:52:11.568953 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.568951 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:52:11.569122 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.568966 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55fefe-543b-4e58-b85c-9b2743549858-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:52:11.569122 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.568979 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:52:11.569122 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.568991 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a55fefe-543b-4e58-b85c-9b2743549858-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\""
Apr 24 21:52:11.893279 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.893238 2568 generic.go:358] "Generic (PLEG): container finished" podID="6a55fefe-543b-4e58-b85c-9b2743549858" containerID="bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2" exitCode=0
Apr 24 21:52:11.893440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.893324 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" event={"ID":"6a55fefe-543b-4e58-b85c-9b2743549858","Type":"ContainerDied","Data":"bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2"}
Apr 24 21:52:11.893440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.893350 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"
Apr 24 21:52:11.893440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.893366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt" event={"ID":"6a55fefe-543b-4e58-b85c-9b2743549858","Type":"ContainerDied","Data":"806968ec7324bcba6b687a374bd01958aa661ac599c8ded0f3682d4ff22dfd05"}
Apr 24 21:52:11.893440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.893383 2568 scope.go:117] "RemoveContainer" containerID="bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2"
Apr 24 21:52:11.902231 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.902214 2568 scope.go:117] "RemoveContainer" containerID="2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7"
Apr 24 21:52:11.909342 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.909326 2568 scope.go:117] "RemoveContainer" containerID="dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd"
Apr 24 21:52:11.917019 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.917000 2568 scope.go:117] "RemoveContainer" containerID="bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2"
Apr 24 21:52:11.917297 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:52:11.917272 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2\": container with ID starting with bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2 not found: ID does not exist" containerID="bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2"
Apr 24 21:52:11.917377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.917304 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2"} err="failed to get container status \"bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2\": rpc error: code = NotFound desc = could not find container \"bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2\": container with ID starting with bdcd9bae9f67f8a80afa34801eeafa38699f5a2f4bc736435bd62294301a78c2 not found: ID does not exist"
Apr 24 21:52:11.917377 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.917322 2568 scope.go:117] "RemoveContainer" containerID="2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7"
Apr 24 21:52:11.917529 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:52:11.917512 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7\": container with ID starting with 2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7 not found: ID does not exist" containerID="2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7"
Apr 24 21:52:11.917583 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.917532 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7"} err="failed to get container status \"2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7\": rpc error: code = NotFound desc = could not find container \"2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7\": container with ID starting with 2438df8f7645215fdace7aedce1485846408ada44f9fbcf1969bcda7e6cd76e7 not found: ID does not exist"
Apr 24 21:52:11.917583 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.917547 2568 scope.go:117] "RemoveContainer" containerID="dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd"
Apr 24 21:52:11.917811 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.917788 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"]
Apr 24 21:52:11.917897 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:52:11.917795 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd\": container with ID starting with dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd not found: ID does not exist" containerID="dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd"
Apr 24 21:52:11.917897 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.917832 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd"} err="failed to get container status \"dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd\": rpc error: code = NotFound desc = could not find container \"dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd\": container with ID starting with dc258736741c9134ddfd849ca812d50cf79f8e6eb9b0879d6aa4c93030217cfd not found: ID does not exist"
Apr 24 21:52:11.924093 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:11.924071 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-79b8cd6b7f-l75rt"]
Apr 24 21:52:13.917465 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:13.917420 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" path="/var/lib/kubelet/pods/6a55fefe-543b-4e58-b85c-9b2743549858/volumes"
Apr 24 21:52:17.872836 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.872800 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"]
Apr 24 21:52:17.873317 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.873224 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="main"
Apr 24 21:52:17.873317 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.873237 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="main"
Apr 24 21:52:17.873317 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.873246 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="tokenizer"
Apr 24 21:52:17.873317 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.873252 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="tokenizer"
Apr 24 21:52:17.873317 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.873270 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="storage-initializer"
Apr 24 21:52:17.873317 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.873278 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="storage-initializer"
Apr 24 21:52:17.873671 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.873398 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="main"
Apr 24 21:52:17.873671 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.873413 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a55fefe-543b-4e58-b85c-9b2743549858" containerName="tokenizer"
Apr 24 21:52:17.878275 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.878256 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"
Apr 24 21:52:17.880668 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.880648 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-6cvzw\""
Apr 24 21:52:17.880802 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.880650 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 24 21:52:17.887815 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:17.887795 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"]
Apr 24 21:52:18.023470 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.023439 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"
Apr 24 21:52:18.023636 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.023492 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"
Apr 24 21:52:18.023636 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.023509 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"
Apr 24 21:52:18.023636 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.023548 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdrh4\" (UniqueName: \"kubernetes.io/projected/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kube-api-access-sdrh4\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"
Apr 24 21:52:18.023636 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.023564 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"
Apr 24 21:52:18.023636 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.023583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tls-certs\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.124852 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.124777 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdrh4\" (UniqueName: \"kubernetes.io/projected/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kube-api-access-sdrh4\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.124852 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.124810 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.124852 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.124834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.125122 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.124875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-uds\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.125122 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.124906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.125122 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.124923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.125255 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.125231 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.125294 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.125271 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-tmp\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.125360 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.125341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.125396 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.125343 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.127347 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.127331 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.133611 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.133588 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdrh4\" (UniqueName: \"kubernetes.io/projected/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kube-api-access-sdrh4\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.188603 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.188576 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:18.311188 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.311164 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"] Apr 24 21:52:18.312837 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:52:18.312805 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6df5edb_fcdf_48a4_bb0d_7dacbc4bb4e4.slice/crio-f00ad76f9ed34fa5caf99d775e5750d4e848c7ee1319cbbbe8c4a19e97bcb127 WatchSource:0}: Error finding container f00ad76f9ed34fa5caf99d775e5750d4e848c7ee1319cbbbe8c4a19e97bcb127: Status 404 returned error can't find the container with id f00ad76f9ed34fa5caf99d775e5750d4e848c7ee1319cbbbe8c4a19e97bcb127 Apr 24 21:52:18.314752 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.314732 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:52:18.920298 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.920266 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" event={"ID":"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4","Type":"ContainerStarted","Data":"d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa"} Apr 24 21:52:18.920298 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:18.920298 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" event={"ID":"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4","Type":"ContainerStarted","Data":"f00ad76f9ed34fa5caf99d775e5750d4e848c7ee1319cbbbe8c4a19e97bcb127"} Apr 24 21:52:19.925128 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:19.925089 2568 generic.go:358] "Generic (PLEG): container finished" podID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerID="d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa" exitCode=0 Apr 24 21:52:19.925689 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:19.925135 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" event={"ID":"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4","Type":"ContainerDied","Data":"d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa"} Apr 24 21:52:20.930860 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:20.930821 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" event={"ID":"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4","Type":"ContainerStarted","Data":"5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503"} Apr 24 21:52:20.930860 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:20.930864 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" event={"ID":"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4","Type":"ContainerStarted","Data":"7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3"} Apr 24 21:52:20.931404 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:20.930898 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:20.957252 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:20.957208 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" podStartSLOduration=3.957194774 podStartE2EDuration="3.957194774s" podCreationTimestamp="2026-04-24 21:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:52:20.955580644 +0000 UTC m=+1465.558649427" watchObservedRunningTime="2026-04-24 21:52:20.957194774 +0000 UTC m=+1465.560263594" Apr 24 21:52:26.802195 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:26.802161 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"] Apr 24 21:52:26.802651 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:26.802580 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="main" containerID="cri-o://15a6d599d053210761955df04f2e6e9e950a4f6fbb7e85f7ece69cbcd5fe2c89" gracePeriod=30 Apr 24 21:52:26.802735 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:26.802669 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="tokenizer" containerID="cri-o://b8bab41078dc0fdb37da4ff2f8f8918544e64f7f9da5c048414a8f361c105472" gracePeriod=30 Apr 24 21:52:26.954656 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:26.954617 2568 generic.go:358] "Generic (PLEG): container finished" podID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerID="15a6d599d053210761955df04f2e6e9e950a4f6fbb7e85f7ece69cbcd5fe2c89" exitCode=0 Apr 24 21:52:26.954836 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:26.954688 2568 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" event={"ID":"6dd9794c-99c2-475c-a703-764c9e74f02b","Type":"ContainerDied","Data":"15a6d599d053210761955df04f2e6e9e950a4f6fbb7e85f7ece69cbcd5fe2c89"} Apr 24 21:52:27.567786 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:52:27.567756 2568 logging.go:55] [core] [Channel #494 SubChannel #495]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.40:9003", ServerName: "10.134.0.40:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.40:9003: connect: connection refused" Apr 24 21:52:27.960048 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:27.960018 2568 generic.go:358] "Generic (PLEG): container finished" podID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerID="b8bab41078dc0fdb37da4ff2f8f8918544e64f7f9da5c048414a8f361c105472" exitCode=0 Apr 24 21:52:27.960340 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:27.960091 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" event={"ID":"6dd9794c-99c2-475c-a703-764c9e74f02b","Type":"ContainerDied","Data":"b8bab41078dc0fdb37da4ff2f8f8918544e64f7f9da5c048414a8f361c105472"} Apr 24 21:52:27.960340 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:27.960127 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" event={"ID":"6dd9794c-99c2-475c-a703-764c9e74f02b","Type":"ContainerDied","Data":"29fc60fad2804b3caba5c0f0be9cd8767fff4dd760c8c310ed51b117cb27b584"} Apr 24 21:52:27.960340 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:27.960138 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29fc60fad2804b3caba5c0f0be9cd8767fff4dd760c8c310ed51b117cb27b584" Apr 24 21:52:27.962786 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:27.962769 2568 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" Apr 24 21:52:28.116829 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.116753 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-cache\") pod \"6dd9794c-99c2-475c-a703-764c9e74f02b\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " Apr 24 21:52:28.116829 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.116796 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vvsz\" (UniqueName: \"kubernetes.io/projected/6dd9794c-99c2-475c-a703-764c9e74f02b-kube-api-access-4vvsz\") pod \"6dd9794c-99c2-475c-a703-764c9e74f02b\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " Apr 24 21:52:28.116829 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.116817 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-kserve-provision-location\") pod \"6dd9794c-99c2-475c-a703-764c9e74f02b\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " Apr 24 21:52:28.117103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.116852 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd9794c-99c2-475c-a703-764c9e74f02b-tls-certs\") pod \"6dd9794c-99c2-475c-a703-764c9e74f02b\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " Apr 24 21:52:28.117103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.116909 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-uds\") pod \"6dd9794c-99c2-475c-a703-764c9e74f02b\" (UID: 
\"6dd9794c-99c2-475c-a703-764c9e74f02b\") " Apr 24 21:52:28.117103 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.117011 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-tmp\") pod \"6dd9794c-99c2-475c-a703-764c9e74f02b\" (UID: \"6dd9794c-99c2-475c-a703-764c9e74f02b\") " Apr 24 21:52:28.117261 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.117117 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6dd9794c-99c2-475c-a703-764c9e74f02b" (UID: "6dd9794c-99c2-475c-a703-764c9e74f02b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:28.117327 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.117277 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6dd9794c-99c2-475c-a703-764c9e74f02b" (UID: "6dd9794c-99c2-475c-a703-764c9e74f02b"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:28.117427 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.117408 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:52:28.117482 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.117434 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:52:28.117482 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.117447 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6dd9794c-99c2-475c-a703-764c9e74f02b" (UID: "6dd9794c-99c2-475c-a703-764c9e74f02b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:28.117810 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.117781 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6dd9794c-99c2-475c-a703-764c9e74f02b" (UID: "6dd9794c-99c2-475c-a703-764c9e74f02b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:28.119057 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.119025 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd9794c-99c2-475c-a703-764c9e74f02b-kube-api-access-4vvsz" (OuterVolumeSpecName: "kube-api-access-4vvsz") pod "6dd9794c-99c2-475c-a703-764c9e74f02b" (UID: "6dd9794c-99c2-475c-a703-764c9e74f02b"). InnerVolumeSpecName "kube-api-access-4vvsz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:52:28.119319 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.119298 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd9794c-99c2-475c-a703-764c9e74f02b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6dd9794c-99c2-475c-a703-764c9e74f02b" (UID: "6dd9794c-99c2-475c-a703-764c9e74f02b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:52:28.189372 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.189340 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:28.189517 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.189497 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:28.192039 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.192020 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:28.218881 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.218856 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vvsz\" (UniqueName: \"kubernetes.io/projected/6dd9794c-99c2-475c-a703-764c9e74f02b-kube-api-access-4vvsz\") on node 
\"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:52:28.218881 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.218877 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:52:28.219019 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.218888 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd9794c-99c2-475c-a703-764c9e74f02b-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:52:28.219019 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.218898 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6dd9794c-99c2-475c-a703-764c9e74f02b-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:52:28.567795 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.567756 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.40:9003\" within 1s: context deadline exceeded" Apr 24 21:52:28.963193 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.963105 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb" Apr 24 21:52:28.964455 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:28.964434 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:29.003911 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:29.003889 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"] Apr 24 21:52:29.008648 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:29.008625 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-67d85d6d77-lb5bb"] Apr 24 21:52:29.917102 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:29.917067 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" path="/var/lib/kubelet/pods/6dd9794c-99c2-475c-a703-764c9e74f02b/volumes" Apr 24 21:52:50.970846 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:50.970817 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:52:55.905780 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:55.905750 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:52:55.908513 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:52:55.908484 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:53:04.086982 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.086949 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve/llmisvc-controller-manager-6c58f78c97-glpn7"] Apr 24 21:53:04.087369 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.087199 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" podUID="30d4e8b1-4a7c-4475-965e-71681ac30d9e" containerName="manager" containerID="cri-o://c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78" gracePeriod=30 Apr 24 21:53:04.324320 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.324296 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" Apr 24 21:53:04.414592 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.414523 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4fhw\" (UniqueName: \"kubernetes.io/projected/30d4e8b1-4a7c-4475-965e-71681ac30d9e-kube-api-access-j4fhw\") pod \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\" (UID: \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\") " Apr 24 21:53:04.414592 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.414554 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30d4e8b1-4a7c-4475-965e-71681ac30d9e-cert\") pod \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\" (UID: \"30d4e8b1-4a7c-4475-965e-71681ac30d9e\") " Apr 24 21:53:04.416602 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.416578 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d4e8b1-4a7c-4475-965e-71681ac30d9e-cert" (OuterVolumeSpecName: "cert") pod "30d4e8b1-4a7c-4475-965e-71681ac30d9e" (UID: "30d4e8b1-4a7c-4475-965e-71681ac30d9e"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:53:04.416681 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.416631 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d4e8b1-4a7c-4475-965e-71681ac30d9e-kube-api-access-j4fhw" (OuterVolumeSpecName: "kube-api-access-j4fhw") pod "30d4e8b1-4a7c-4475-965e-71681ac30d9e" (UID: "30d4e8b1-4a7c-4475-965e-71681ac30d9e"). InnerVolumeSpecName "kube-api-access-j4fhw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:53:04.516027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.516004 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4fhw\" (UniqueName: \"kubernetes.io/projected/30d4e8b1-4a7c-4475-965e-71681ac30d9e-kube-api-access-j4fhw\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:53:04.516027 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:04.516026 2568 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30d4e8b1-4a7c-4475-965e-71681ac30d9e-cert\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:53:05.085477 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.085439 2568 generic.go:358] "Generic (PLEG): container finished" podID="30d4e8b1-4a7c-4475-965e-71681ac30d9e" containerID="c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78" exitCode=0 Apr 24 21:53:05.085676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.085501 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" Apr 24 21:53:05.085676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.085521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" event={"ID":"30d4e8b1-4a7c-4475-965e-71681ac30d9e","Type":"ContainerDied","Data":"c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78"} Apr 24 21:53:05.085676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.085557 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6c58f78c97-glpn7" event={"ID":"30d4e8b1-4a7c-4475-965e-71681ac30d9e","Type":"ContainerDied","Data":"b98f2c2a961e1d601ca32ba8dca95df0fc1da7dc8c676102376d8f38b793fc51"} Apr 24 21:53:05.085676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.085573 2568 scope.go:117] "RemoveContainer" containerID="c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78" Apr 24 21:53:05.094425 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.094245 2568 scope.go:117] "RemoveContainer" containerID="c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78" Apr 24 21:53:05.094646 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:53:05.094493 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78\": container with ID starting with c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78 not found: ID does not exist" containerID="c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78" Apr 24 21:53:05.094646 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.094512 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78"} err="failed to get container status 
\"c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78\": rpc error: code = NotFound desc = could not find container \"c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78\": container with ID starting with c13979fc0f6a649411e6338c34efcb54183d514b37f57ce1f3418eb08b1a3a78 not found: ID does not exist" Apr 24 21:53:05.106341 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.106316 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-6c58f78c97-glpn7"] Apr 24 21:53:05.112250 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.112227 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-6c58f78c97-glpn7"] Apr 24 21:53:05.917429 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:53:05.917397 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d4e8b1-4a7c-4475-965e-71681ac30d9e" path="/var/lib/kubelet/pods/30d4e8b1-4a7c-4475-965e-71681ac30d9e/volumes" Apr 24 21:56:21.131350 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:21.131271 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"] Apr 24 21:56:21.131814 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:21.131552 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="main" containerID="cri-o://7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3" gracePeriod=30 Apr 24 21:56:21.131814 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:21.131601 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="tokenizer" 
containerID="cri-o://5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503" gracePeriod=30 Apr 24 21:56:21.751916 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:21.751883 2568 generic.go:358] "Generic (PLEG): container finished" podID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerID="7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3" exitCode=0 Apr 24 21:56:21.752091 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:21.751926 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" event={"ID":"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4","Type":"ContainerDied","Data":"7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3"} Apr 24 21:56:22.273444 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.273421 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:56:22.420793 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.420724 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kserve-provision-location\") pod \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " Apr 24 21:56:22.420957 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.420793 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-tmp\") pod \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " Apr 24 21:56:22.420957 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.420828 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-uds\") pod \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " Apr 24 21:56:22.420957 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.420846 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tls-certs\") pod \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " Apr 24 21:56:22.420957 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.420872 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdrh4\" (UniqueName: \"kubernetes.io/projected/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kube-api-access-sdrh4\") pod \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " Apr 24 21:56:22.421178 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.420914 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-cache\") pod \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\" (UID: \"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4\") " Apr 24 21:56:22.421178 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.421122 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" (UID: "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:22.421281 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.421249 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" (UID: "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:22.421281 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.421263 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" (UID: "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:22.421355 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.421280 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:56:22.421540 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.421518 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" (UID: "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:22.423025 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.422996 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" (UID: "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:56:22.423125 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.423044 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kube-api-access-sdrh4" (OuterVolumeSpecName: "kube-api-access-sdrh4") pod "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" (UID: "f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4"). InnerVolumeSpecName "kube-api-access-sdrh4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:56:22.522440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.522416 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:56:22.522440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.522439 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdrh4\" (UniqueName: \"kubernetes.io/projected/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kube-api-access-sdrh4\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:56:22.522602 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.522449 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:56:22.522602 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.522459 
2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:56:22.522602 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.522469 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 21:56:22.757904 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.757868 2568 generic.go:358] "Generic (PLEG): container finished" podID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerID="5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503" exitCode=0 Apr 24 21:56:22.758068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.757934 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" Apr 24 21:56:22.758068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.757958 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" event={"ID":"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4","Type":"ContainerDied","Data":"5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503"} Apr 24 21:56:22.758068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.758003 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn" event={"ID":"f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4","Type":"ContainerDied","Data":"f00ad76f9ed34fa5caf99d775e5750d4e848c7ee1319cbbbe8c4a19e97bcb127"} Apr 24 21:56:22.758068 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.758026 2568 scope.go:117] "RemoveContainer" 
containerID="5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503" Apr 24 21:56:22.766645 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.766626 2568 scope.go:117] "RemoveContainer" containerID="7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3" Apr 24 21:56:22.773778 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.773760 2568 scope.go:117] "RemoveContainer" containerID="d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa" Apr 24 21:56:22.780319 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.780302 2568 scope.go:117] "RemoveContainer" containerID="5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503" Apr 24 21:56:22.780562 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:56:22.780533 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503\": container with ID starting with 5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503 not found: ID does not exist" containerID="5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503" Apr 24 21:56:22.780649 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.780561 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503"} err="failed to get container status \"5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503\": rpc error: code = NotFound desc = could not find container \"5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503\": container with ID starting with 5a53dafefa99a4321f1f96fec38ed02dea4d0248e1a332dfbd1d8b38bf651503 not found: ID does not exist" Apr 24 21:56:22.780649 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.780579 2568 scope.go:117] "RemoveContainer" containerID="7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3" Apr 24 21:56:22.780881 
ip-10-0-134-232 kubenswrapper[2568]: E0424 21:56:22.780861 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3\": container with ID starting with 7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3 not found: ID does not exist" containerID="7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3" Apr 24 21:56:22.780979 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.780889 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3"} err="failed to get container status \"7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3\": rpc error: code = NotFound desc = could not find container \"7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3\": container with ID starting with 7eab14f03ca1993f6f52ef3ed53a4a64586fd36a0dc66a772e9af706d8ad26d3 not found: ID does not exist" Apr 24 21:56:22.780979 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.780910 2568 scope.go:117] "RemoveContainer" containerID="d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa" Apr 24 21:56:22.781177 ip-10-0-134-232 kubenswrapper[2568]: E0424 21:56:22.781154 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa\": container with ID starting with d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa not found: ID does not exist" containerID="d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa" Apr 24 21:56:22.781324 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.781183 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa"} err="failed to get container status \"d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa\": rpc error: code = NotFound desc = could not find container \"d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa\": container with ID starting with d92bfc31bb59667eb09ec9c6f56d81c2d29e2ced22d07039049019882d0ff0fa not found: ID does not exist" Apr 24 21:56:22.783061 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.783041 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"] Apr 24 21:56:22.789340 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:22.789321 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schexhbwn"] Apr 24 21:56:23.917593 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:23.917559 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" path="/var/lib/kubelet/pods/f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4/volumes" Apr 24 21:56:30.995132 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995105 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7"] Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995435 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="main" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995446 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="main" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995456 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="30d4e8b1-4a7c-4475-965e-71681ac30d9e" containerName="manager" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995462 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d4e8b1-4a7c-4475-965e-71681ac30d9e" containerName="manager" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995470 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="tokenizer" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995476 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="tokenizer" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995484 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="tokenizer" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995489 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="tokenizer" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995498 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="main" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995502 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="main" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995510 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="storage-initializer" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995515 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" 
containerName="storage-initializer" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995524 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="storage-initializer" Apr 24 21:56:30.995576 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995529 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="storage-initializer" Apr 24 21:56:30.996043 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995600 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="main" Apr 24 21:56:30.996043 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995610 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="tokenizer" Apr 24 21:56:30.996043 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995617 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dd9794c-99c2-475c-a703-764c9e74f02b" containerName="tokenizer" Apr 24 21:56:30.996043 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995623 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="30d4e8b1-4a7c-4475-965e-71681ac30d9e" containerName="manager" Apr 24 21:56:30.996043 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.995630 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6df5edb-fcdf-48a4-bb0d-7dacbc4bb4e4" containerName="main" Apr 24 21:56:30.998734 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:30.998698 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.002676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.002655 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:56:31.002676 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.002655 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:56:31.003053 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.002655 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 24 21:56:31.003053 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.002989 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:56:31.003053 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.002992 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-8f7c5\"" Apr 24 21:56:31.016561 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.016537 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7"] Apr 24 21:56:31.099300 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.099275 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.099445 
ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.099305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.099445 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.099326 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk952\" (UniqueName: \"kubernetes.io/projected/b2896189-5142-4b52-a039-48b5bf4a02d8-kube-api-access-lk952\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.099528 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.099455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.099528 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.099482 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.099528 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.099503 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2896189-5142-4b52-a039-48b5bf4a02d8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.200897 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.200862 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.200897 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.200897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.201104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.200929 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2896189-5142-4b52-a039-48b5bf4a02d8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.201104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.201009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.201104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.201035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.201104 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.201060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk952\" (UniqueName: \"kubernetes.io/projected/b2896189-5142-4b52-a039-48b5bf4a02d8-kube-api-access-lk952\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.201314 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.201291 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.201356 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.201326 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.201418 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.201366 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.201418 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.201387 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.203422 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.203406 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2896189-5142-4b52-a039-48b5bf4a02d8-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.210793 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.210773 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk952\" (UniqueName: \"kubernetes.io/projected/b2896189-5142-4b52-a039-48b5bf4a02d8-kube-api-access-lk952\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.307867 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.307807 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:31.427833 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.427803 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7"] Apr 24 21:56:31.431346 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:56:31.431320 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2896189_5142_4b52_a039_48b5bf4a02d8.slice/crio-43d3197801eacf4a0b2cbe420ae6b861db455a3ccc1ec147c4723c6109cab0b1 WatchSource:0}: Error finding container 43d3197801eacf4a0b2cbe420ae6b861db455a3ccc1ec147c4723c6109cab0b1: Status 404 returned error can't find the container with id 43d3197801eacf4a0b2cbe420ae6b861db455a3ccc1ec147c4723c6109cab0b1 Apr 24 21:56:31.791331 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.791292 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" event={"ID":"b2896189-5142-4b52-a039-48b5bf4a02d8","Type":"ContainerStarted","Data":"7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222"} Apr 24 
21:56:31.791331 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:31.791335 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" event={"ID":"b2896189-5142-4b52-a039-48b5bf4a02d8","Type":"ContainerStarted","Data":"43d3197801eacf4a0b2cbe420ae6b861db455a3ccc1ec147c4723c6109cab0b1"} Apr 24 21:56:32.796431 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:32.796396 2568 generic.go:358] "Generic (PLEG): container finished" podID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerID="7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222" exitCode=0 Apr 24 21:56:32.796801 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:32.796458 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" event={"ID":"b2896189-5142-4b52-a039-48b5bf4a02d8","Type":"ContainerDied","Data":"7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222"} Apr 24 21:56:33.802299 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:33.802255 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" event={"ID":"b2896189-5142-4b52-a039-48b5bf4a02d8","Type":"ContainerStarted","Data":"a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88"} Apr 24 21:56:33.802299 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:33.802303 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" event={"ID":"b2896189-5142-4b52-a039-48b5bf4a02d8","Type":"ContainerStarted","Data":"bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529"} Apr 24 21:56:33.802735 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:33.802381 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:33.829274 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:33.829224 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" podStartSLOduration=3.829207504 podStartE2EDuration="3.829207504s" podCreationTimestamp="2026-04-24 21:56:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:33.82601955 +0000 UTC m=+1718.429088332" watchObservedRunningTime="2026-04-24 21:56:33.829207504 +0000 UTC m=+1718.432276289" Apr 24 21:56:41.308830 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:41.308781 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:41.309230 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:41.308934 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:41.311620 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:41.311594 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:41.835375 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:41.835347 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:56:55.994635 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:55.994607 2568 scope.go:117] "RemoveContainer" containerID="b8bab41078dc0fdb37da4ff2f8f8918544e64f7f9da5c048414a8f361c105472" Apr 24 21:56:56.002545 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:56.002524 2568 
scope.go:117] "RemoveContainer" containerID="15a6d599d053210761955df04f2e6e9e950a4f6fbb7e85f7ece69cbcd5fe2c89" Apr 24 21:56:56.009577 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:56:56.009556 2568 scope.go:117] "RemoveContainer" containerID="248b1daadaa8c6fb96387cc4f6a554ea232e485415442f00d91a294c930c1e8b" Apr 24 21:57:03.841805 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:57:03.841776 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 21:57:55.929650 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:57:55.929618 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:57:55.933546 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:57:55.933526 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 21:58:00.351439 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.351402 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw"] Apr 24 21:58:00.355299 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.355278 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.359066 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.359043 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-zpxvx\"" Apr 24 21:58:00.359906 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.359884 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 24 21:58:00.369943 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.369924 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw"] Apr 24 21:58:00.443184 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.443161 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.443303 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.443192 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.443303 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.443211 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.443303 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.443258 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.443303 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.443301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.443445 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.443318 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpfr\" (UniqueName: \"kubernetes.io/projected/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kube-api-access-tkpfr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.544545 ip-10-0-134-232 
kubenswrapper[2568]: I0424 21:58:00.544521 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.544662 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.544552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.544662 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.544572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.544824 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.544724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.544824 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.544781 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.544824 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.544811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkpfr\" (UniqueName: \"kubernetes.io/projected/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kube-api-access-tkpfr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.544985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.544893 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.544985 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.544959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.545109 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.545022 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.545175 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.545154 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.547121 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.547101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.555390 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.555365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpfr\" (UniqueName: \"kubernetes.io/projected/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kube-api-access-tkpfr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.665292 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.665216 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:00.791440 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.791411 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw"] Apr 24 21:58:00.793512 ip-10-0-134-232 kubenswrapper[2568]: W0424 21:58:00.793485 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af7a02a_4bbe_45be_ab19_c3a4030fb83f.slice/crio-a529e70d08e25d0239d39426527d0ab2a28792bb4f4ecbf770b6ca7a9914d71f WatchSource:0}: Error finding container a529e70d08e25d0239d39426527d0ab2a28792bb4f4ecbf770b6ca7a9914d71f: Status 404 returned error can't find the container with id a529e70d08e25d0239d39426527d0ab2a28792bb4f4ecbf770b6ca7a9914d71f Apr 24 21:58:00.795551 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:00.795532 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:58:01.100344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:01.100301 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" event={"ID":"4af7a02a-4bbe-45be-ab19-c3a4030fb83f","Type":"ContainerStarted","Data":"ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9"} Apr 24 21:58:01.100344 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:01.100350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" event={"ID":"4af7a02a-4bbe-45be-ab19-c3a4030fb83f","Type":"ContainerStarted","Data":"a529e70d08e25d0239d39426527d0ab2a28792bb4f4ecbf770b6ca7a9914d71f"} Apr 24 21:58:02.105616 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:02.105578 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerID="ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9" exitCode=0 Apr 24 21:58:02.106007 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:02.105664 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" event={"ID":"4af7a02a-4bbe-45be-ab19-c3a4030fb83f","Type":"ContainerDied","Data":"ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9"} Apr 24 21:58:03.111883 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:03.111847 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" event={"ID":"4af7a02a-4bbe-45be-ab19-c3a4030fb83f","Type":"ContainerStarted","Data":"05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c"} Apr 24 21:58:03.111883 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:03.111884 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" event={"ID":"4af7a02a-4bbe-45be-ab19-c3a4030fb83f","Type":"ContainerStarted","Data":"fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20"} Apr 24 21:58:03.112333 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:03.111958 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:03.148032 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:03.147983 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" podStartSLOduration=3.147965485 podStartE2EDuration="3.147965485s" podCreationTimestamp="2026-04-24 21:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 21:58:03.145924362 +0000 UTC m=+1807.748993145" watchObservedRunningTime="2026-04-24 21:58:03.147965485 +0000 UTC m=+1807.751034266" Apr 24 21:58:10.666329 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:10.666296 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:10.666734 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:10.666340 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:10.668787 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:10.668765 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:11.139210 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:11.139177 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 21:58:32.142557 ip-10-0-134-232 kubenswrapper[2568]: I0424 21:58:32.142530 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 22:02:55.954640 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:02:55.954612 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 22:02:55.959328 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:02:55.959310 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 22:07:55.979976 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:07:55.979943 2568 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 22:07:55.986176 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:07:55.986155 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 22:11:24.209188 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:24.209098 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7"] Apr 24 22:11:24.209683 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:24.209498 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="main" containerID="cri-o://bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529" gracePeriod=30 Apr 24 22:11:24.209683 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:24.209552 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="tokenizer" containerID="cri-o://a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88" gracePeriod=30 Apr 24 22:11:24.839219 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:24.839180 2568 generic.go:358] "Generic (PLEG): container finished" podID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerID="bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529" exitCode=0 Apr 24 22:11:24.839396 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:24.839252 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" 
event={"ID":"b2896189-5142-4b52-a039-48b5bf4a02d8","Type":"ContainerDied","Data":"bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529"} Apr 24 22:11:25.360155 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.360136 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 22:11:25.479010 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.478931 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk952\" (UniqueName: \"kubernetes.io/projected/b2896189-5142-4b52-a039-48b5bf4a02d8-kube-api-access-lk952\") pod \"b2896189-5142-4b52-a039-48b5bf4a02d8\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " Apr 24 22:11:25.479010 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.478966 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-tmp\") pod \"b2896189-5142-4b52-a039-48b5bf4a02d8\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " Apr 24 22:11:25.479010 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.478989 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-uds\") pod \"b2896189-5142-4b52-a039-48b5bf4a02d8\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " Apr 24 22:11:25.479010 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.479004 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-cache\") pod \"b2896189-5142-4b52-a039-48b5bf4a02d8\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " Apr 24 22:11:25.479328 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.479025 2568 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-kserve-provision-location\") pod \"b2896189-5142-4b52-a039-48b5bf4a02d8\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " Apr 24 22:11:25.479328 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.479090 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2896189-5142-4b52-a039-48b5bf4a02d8-tls-certs\") pod \"b2896189-5142-4b52-a039-48b5bf4a02d8\" (UID: \"b2896189-5142-4b52-a039-48b5bf4a02d8\") " Apr 24 22:11:25.479426 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.479338 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b2896189-5142-4b52-a039-48b5bf4a02d8" (UID: "b2896189-5142-4b52-a039-48b5bf4a02d8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:11:25.479426 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.479346 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b2896189-5142-4b52-a039-48b5bf4a02d8" (UID: "b2896189-5142-4b52-a039-48b5bf4a02d8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:11:25.479426 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.479383 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b2896189-5142-4b52-a039-48b5bf4a02d8" (UID: "b2896189-5142-4b52-a039-48b5bf4a02d8"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:11:25.479821 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.479801 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b2896189-5142-4b52-a039-48b5bf4a02d8" (UID: "b2896189-5142-4b52-a039-48b5bf4a02d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:11:25.481211 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.481193 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2896189-5142-4b52-a039-48b5bf4a02d8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b2896189-5142-4b52-a039-48b5bf4a02d8" (UID: "b2896189-5142-4b52-a039-48b5bf4a02d8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:11:25.481321 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.481302 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2896189-5142-4b52-a039-48b5bf4a02d8-kube-api-access-lk952" (OuterVolumeSpecName: "kube-api-access-lk952") pod "b2896189-5142-4b52-a039-48b5bf4a02d8" (UID: "b2896189-5142-4b52-a039-48b5bf4a02d8"). InnerVolumeSpecName "kube-api-access-lk952". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:11:25.580374 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.580345 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lk952\" (UniqueName: \"kubernetes.io/projected/b2896189-5142-4b52-a039-48b5bf4a02d8-kube-api-access-lk952\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:11:25.580374 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.580368 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:11:25.580537 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.580382 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:11:25.580537 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.580394 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:11:25.580537 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.580408 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2896189-5142-4b52-a039-48b5bf4a02d8-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:11:25.580537 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.580419 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2896189-5142-4b52-a039-48b5bf4a02d8-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:11:25.844837 ip-10-0-134-232 
kubenswrapper[2568]: I0424 22:11:25.844799 2568 generic.go:358] "Generic (PLEG): container finished" podID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerID="a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88" exitCode=0 Apr 24 22:11:25.845048 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.844852 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" event={"ID":"b2896189-5142-4b52-a039-48b5bf4a02d8","Type":"ContainerDied","Data":"a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88"} Apr 24 22:11:25.845048 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.844882 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" Apr 24 22:11:25.845048 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.844895 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7" event={"ID":"b2896189-5142-4b52-a039-48b5bf4a02d8","Type":"ContainerDied","Data":"43d3197801eacf4a0b2cbe420ae6b861db455a3ccc1ec147c4723c6109cab0b1"} Apr 24 22:11:25.845048 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.844912 2568 scope.go:117] "RemoveContainer" containerID="a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88" Apr 24 22:11:25.852905 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.852886 2568 scope.go:117] "RemoveContainer" containerID="bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529" Apr 24 22:11:25.860164 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.860147 2568 scope.go:117] "RemoveContainer" containerID="7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222" Apr 24 22:11:25.866768 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.866750 2568 scope.go:117] "RemoveContainer" 
containerID="a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88" Apr 24 22:11:25.866983 ip-10-0-134-232 kubenswrapper[2568]: E0424 22:11:25.866967 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88\": container with ID starting with a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88 not found: ID does not exist" containerID="a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88" Apr 24 22:11:25.867037 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.866993 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88"} err="failed to get container status \"a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88\": rpc error: code = NotFound desc = could not find container \"a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88\": container with ID starting with a3d6d3cdb8936830d61572f33d45d73b95501335e1ffae74467d3966940c1b88 not found: ID does not exist" Apr 24 22:11:25.867037 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.867008 2568 scope.go:117] "RemoveContainer" containerID="bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529" Apr 24 22:11:25.867263 ip-10-0-134-232 kubenswrapper[2568]: E0424 22:11:25.867235 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529\": container with ID starting with bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529 not found: ID does not exist" containerID="bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529" Apr 24 22:11:25.867338 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.867261 2568 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529"} err="failed to get container status \"bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529\": rpc error: code = NotFound desc = could not find container \"bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529\": container with ID starting with bfb5ed1540252cc5030e943c46c4cf8d7bf1f57e95a0a3f49f4445b7a6a91529 not found: ID does not exist" Apr 24 22:11:25.867338 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.867283 2568 scope.go:117] "RemoveContainer" containerID="7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222" Apr 24 22:11:25.867534 ip-10-0-134-232 kubenswrapper[2568]: E0424 22:11:25.867518 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222\": container with ID starting with 7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222 not found: ID does not exist" containerID="7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222" Apr 24 22:11:25.867580 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.867542 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222"} err="failed to get container status \"7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222\": rpc error: code = NotFound desc = could not find container \"7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222\": container with ID starting with 7ab492f4c29786a84a7d625edca7e4e7c9af32977495157d904c1584e0a60222 not found: ID does not exist" Apr 24 22:11:25.885086 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.885063 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7"] Apr 24 
22:11:25.886657 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.886637 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-79f868l9p7"] Apr 24 22:11:25.916273 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:25.916252 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" path="/var/lib/kubelet/pods/b2896189-5142-4b52-a039-48b5bf4a02d8/volumes" Apr 24 22:11:36.640717 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.640678 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss"] Apr 24 22:11:36.641153 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.641121 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="storage-initializer" Apr 24 22:11:36.641153 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.641136 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="storage-initializer" Apr 24 22:11:36.641153 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.641143 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="main" Apr 24 22:11:36.641153 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.641148 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="main" Apr 24 22:11:36.641292 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.641159 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="tokenizer" Apr 24 22:11:36.641292 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.641164 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="tokenizer" Apr 24 22:11:36.641292 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.641234 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="main" Apr 24 22:11:36.641292 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.641244 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2896189-5142-4b52-a039-48b5bf4a02d8" containerName="tokenizer" Apr 24 22:11:36.644361 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.644345 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.646920 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.646894 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 24 22:11:36.647050 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.646926 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-wfbfd\"" Apr 24 22:11:36.654507 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.654485 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss"] Apr 24 22:11:36.771476 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.771441 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.771639 ip-10-0-134-232 
kubenswrapper[2568]: I0424 22:11:36.771520 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w48pf\" (UniqueName: \"kubernetes.io/projected/d06de066-ad28-4e5a-8cb7-fc225af37098-kube-api-access-w48pf\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.771639 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.771561 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.771639 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.771633 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d06de066-ad28-4e5a-8cb7-fc225af37098-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.771843 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.771664 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 
22:11:36.771843 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.771695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873030 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873001 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873191 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w48pf\" (UniqueName: \"kubernetes.io/projected/d06de066-ad28-4e5a-8cb7-fc225af37098-kube-api-access-w48pf\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873191 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 
22:11:36.873191 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d06de066-ad28-4e5a-8cb7-fc225af37098-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873191 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873146 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873191 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873165 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873494 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873470 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873547 ip-10-0-134-232 
kubenswrapper[2568]: I0424 22:11:36.873508 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873587 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873540 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.873587 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.873570 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.875476 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.875460 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d06de066-ad28-4e5a-8cb7-fc225af37098-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.882944 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.882917 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w48pf\" (UniqueName: \"kubernetes.io/projected/d06de066-ad28-4e5a-8cb7-fc225af37098-kube-api-access-w48pf\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:36.954610 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:36.954552 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:37.281179 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:37.281154 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss"] Apr 24 22:11:37.283557 ip-10-0-134-232 kubenswrapper[2568]: W0424 22:11:37.283523 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd06de066_ad28_4e5a_8cb7_fc225af37098.slice/crio-84a080a5222b13030ae612b58d05e10b0e7417a521f033a31ceaeef741436a57 WatchSource:0}: Error finding container 84a080a5222b13030ae612b58d05e10b0e7417a521f033a31ceaeef741436a57: Status 404 returned error can't find the container with id 84a080a5222b13030ae612b58d05e10b0e7417a521f033a31ceaeef741436a57 Apr 24 22:11:37.285828 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:37.285810 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:11:37.889469 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:37.889427 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" event={"ID":"d06de066-ad28-4e5a-8cb7-fc225af37098","Type":"ContainerStarted","Data":"d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524"} 
Apr 24 22:11:37.889469 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:37.889468 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" event={"ID":"d06de066-ad28-4e5a-8cb7-fc225af37098","Type":"ContainerStarted","Data":"84a080a5222b13030ae612b58d05e10b0e7417a521f033a31ceaeef741436a57"} Apr 24 22:11:38.894957 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:38.894919 2568 generic.go:358] "Generic (PLEG): container finished" podID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerID="d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524" exitCode=0 Apr 24 22:11:38.895309 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:38.894964 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" event={"ID":"d06de066-ad28-4e5a-8cb7-fc225af37098","Type":"ContainerDied","Data":"d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524"} Apr 24 22:11:39.900140 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:39.900103 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" event={"ID":"d06de066-ad28-4e5a-8cb7-fc225af37098","Type":"ContainerStarted","Data":"b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41"} Apr 24 22:11:39.900140 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:39.900143 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" event={"ID":"d06de066-ad28-4e5a-8cb7-fc225af37098","Type":"ContainerStarted","Data":"7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583"} Apr 24 22:11:39.900542 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:39.900219 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:39.924848 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:39.924805 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" podStartSLOduration=3.9247906439999998 podStartE2EDuration="3.924790644s" podCreationTimestamp="2026-04-24 22:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:11:39.922318618 +0000 UTC m=+2624.525387412" watchObservedRunningTime="2026-04-24 22:11:39.924790644 +0000 UTC m=+2624.527859424" Apr 24 22:11:46.954776 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:46.954736 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:46.954776 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:46.954775 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:46.957053 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:46.957025 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:11:47.936125 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:11:47.936099 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:12:08.940017 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:08.939991 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:12:29.191037 ip-10-0-134-232 
kubenswrapper[2568]: I0424 22:12:29.190999 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw"] Apr 24 22:12:29.191544 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:29.191312 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="main" containerID="cri-o://fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20" gracePeriod=30 Apr 24 22:12:29.191544 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:29.191380 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="tokenizer" containerID="cri-o://05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c" gracePeriod=30 Apr 24 22:12:30.077067 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.077020 2568 generic.go:358] "Generic (PLEG): container finished" podID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerID="fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20" exitCode=0 Apr 24 22:12:30.077239 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.077076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" event={"ID":"4af7a02a-4bbe-45be-ab19-c3a4030fb83f","Type":"ContainerDied","Data":"fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20"} Apr 24 22:12:30.336378 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.336325 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 22:12:30.428696 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.428671 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-tmp\") pod \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " Apr 24 22:12:30.428847 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.428702 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-cache\") pod \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " Apr 24 22:12:30.428847 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.428745 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-uds\") pod \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " Apr 24 22:12:30.428847 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.428801 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkpfr\" (UniqueName: \"kubernetes.io/projected/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kube-api-access-tkpfr\") pod \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " Apr 24 22:12:30.428974 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.428868 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tls-certs\") pod \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " Apr 24 
22:12:30.428974 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.428896 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kserve-provision-location\") pod \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\" (UID: \"4af7a02a-4bbe-45be-ab19-c3a4030fb83f\") " Apr 24 22:12:30.429094 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.428971 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4af7a02a-4bbe-45be-ab19-c3a4030fb83f" (UID: "4af7a02a-4bbe-45be-ab19-c3a4030fb83f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:12:30.429094 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.428980 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4af7a02a-4bbe-45be-ab19-c3a4030fb83f" (UID: "4af7a02a-4bbe-45be-ab19-c3a4030fb83f"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:12:30.429094 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.429035 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4af7a02a-4bbe-45be-ab19-c3a4030fb83f" (UID: "4af7a02a-4bbe-45be-ab19-c3a4030fb83f"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:12:30.429271 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.429252 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:12:30.429324 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.429275 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:12:30.429324 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.429293 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:12:30.429573 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.429552 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4af7a02a-4bbe-45be-ab19-c3a4030fb83f" (UID: "4af7a02a-4bbe-45be-ab19-c3a4030fb83f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:12:30.430997 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.430976 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4af7a02a-4bbe-45be-ab19-c3a4030fb83f" (UID: "4af7a02a-4bbe-45be-ab19-c3a4030fb83f"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:12:30.431063 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.430995 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kube-api-access-tkpfr" (OuterVolumeSpecName: "kube-api-access-tkpfr") pod "4af7a02a-4bbe-45be-ab19-c3a4030fb83f" (UID: "4af7a02a-4bbe-45be-ab19-c3a4030fb83f"). InnerVolumeSpecName "kube-api-access-tkpfr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:12:30.530462 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.530442 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:12:30.530549 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.530465 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:12:30.530549 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:30.530475 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkpfr\" (UniqueName: \"kubernetes.io/projected/4af7a02a-4bbe-45be-ab19-c3a4030fb83f-kube-api-access-tkpfr\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:12:31.082800 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.082765 2568 generic.go:358] "Generic (PLEG): container finished" podID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerID="05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c" exitCode=0 Apr 24 22:12:31.082988 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.082863 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" Apr 24 22:12:31.082988 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.082856 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" event={"ID":"4af7a02a-4bbe-45be-ab19-c3a4030fb83f","Type":"ContainerDied","Data":"05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c"} Apr 24 22:12:31.082988 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.082972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw" event={"ID":"4af7a02a-4bbe-45be-ab19-c3a4030fb83f","Type":"ContainerDied","Data":"a529e70d08e25d0239d39426527d0ab2a28792bb4f4ecbf770b6ca7a9914d71f"} Apr 24 22:12:31.083133 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.082991 2568 scope.go:117] "RemoveContainer" containerID="05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c" Apr 24 22:12:31.091778 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.091760 2568 scope.go:117] "RemoveContainer" containerID="fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20" Apr 24 22:12:31.098572 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.098554 2568 scope.go:117] "RemoveContainer" containerID="ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9" Apr 24 22:12:31.105663 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.105614 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw"] Apr 24 22:12:31.105741 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.105684 2568 scope.go:117] "RemoveContainer" containerID="05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c" Apr 24 22:12:31.105986 ip-10-0-134-232 kubenswrapper[2568]: E0424 22:12:31.105966 2568 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c\": container with ID starting with 05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c not found: ID does not exist" containerID="05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c" Apr 24 22:12:31.106079 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.105999 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c"} err="failed to get container status \"05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c\": rpc error: code = NotFound desc = could not find container \"05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c\": container with ID starting with 05a7bdc9b8ab09eae72ae3542d0d3ed8c5e2b668067dcab8bb695c971da74b3c not found: ID does not exist" Apr 24 22:12:31.106079 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.106023 2568 scope.go:117] "RemoveContainer" containerID="fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20" Apr 24 22:12:31.106261 ip-10-0-134-232 kubenswrapper[2568]: E0424 22:12:31.106244 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20\": container with ID starting with fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20 not found: ID does not exist" containerID="fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20" Apr 24 22:12:31.106328 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.106268 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20"} err="failed to get container status 
\"fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20\": rpc error: code = NotFound desc = could not find container \"fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20\": container with ID starting with fac64e0b0d1989847ab1d2055edc5c32fb943316ab1c83dbf23f56bdc53a5d20 not found: ID does not exist" Apr 24 22:12:31.106328 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.106290 2568 scope.go:117] "RemoveContainer" containerID="ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9" Apr 24 22:12:31.106516 ip-10-0-134-232 kubenswrapper[2568]: E0424 22:12:31.106497 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9\": container with ID starting with ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9 not found: ID does not exist" containerID="ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9" Apr 24 22:12:31.106557 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.106522 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9"} err="failed to get container status \"ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9\": rpc error: code = NotFound desc = could not find container \"ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9\": container with ID starting with ea92a5c23482e09cce5d6bbd5fa5d817e8556a5640b7d14dc3826a45201339d9 not found: ID does not exist" Apr 24 22:12:31.110322 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.110303 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schen8tfw"] Apr 24 22:12:31.917467 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:31.917434 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" path="/var/lib/kubelet/pods/4af7a02a-4bbe-45be-ab19-c3a4030fb83f/volumes" Apr 24 22:12:56.003625 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:56.003602 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 22:12:56.010787 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:12:56.010770 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log" Apr 24 22:15:48.524061 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:48.524025 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss"] Apr 24 22:15:48.524644 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:48.524588 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="main" containerID="cri-o://7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583" gracePeriod=30 Apr 24 22:15:48.524780 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:48.524725 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="tokenizer" containerID="cri-o://b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41" gracePeriod=30 Apr 24 22:15:48.732547 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:48.732510 2568 generic.go:358] "Generic (PLEG): container finished" podID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerID="7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583" exitCode=0 Apr 24 22:15:48.732773 ip-10-0-134-232 
kubenswrapper[2568]: I0424 22:15:48.732581 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" event={"ID":"d06de066-ad28-4e5a-8cb7-fc225af37098","Type":"ContainerDied","Data":"7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583"} Apr 24 22:15:48.939332 ip-10-0-134-232 kubenswrapper[2568]: W0424 22:15:48.939242 2568 logging.go:55] [core] [Channel #1813 SubChannel #1814]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.44:9003", ServerName: "10.134.0.44:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.44:9003: connect: connection refused" Apr 24 22:15:49.684836 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.684814 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:15:49.737998 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.737956 2568 generic.go:358] "Generic (PLEG): container finished" podID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerID="b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41" exitCode=0 Apr 24 22:15:49.738149 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.738022 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" event={"ID":"d06de066-ad28-4e5a-8cb7-fc225af37098","Type":"ContainerDied","Data":"b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41"} Apr 24 22:15:49.738149 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.738045 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" Apr 24 22:15:49.738149 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.738056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" event={"ID":"d06de066-ad28-4e5a-8cb7-fc225af37098","Type":"ContainerDied","Data":"84a080a5222b13030ae612b58d05e10b0e7417a521f033a31ceaeef741436a57"} Apr 24 22:15:49.738149 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.738079 2568 scope.go:117] "RemoveContainer" containerID="b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41" Apr 24 22:15:49.745938 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.745919 2568 scope.go:117] "RemoveContainer" containerID="7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583" Apr 24 22:15:49.752684 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.752664 2568 scope.go:117] "RemoveContainer" containerID="d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524" Apr 24 22:15:49.759323 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.759306 2568 scope.go:117] "RemoveContainer" containerID="b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41" Apr 24 22:15:49.759568 ip-10-0-134-232 kubenswrapper[2568]: E0424 22:15:49.759546 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41\": container with ID starting with b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41 not found: ID does not exist" containerID="b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41" Apr 24 22:15:49.759676 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.759572 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41"} err="failed to get container status \"b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41\": rpc error: code = NotFound desc = could not find container \"b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41\": container with ID starting with b5db20c2ff74686f3c32d7b5ca07bc9d5b282dfa33309d373599b6f94f905a41 not found: ID does not exist" Apr 24 22:15:49.759676 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.759590 2568 scope.go:117] "RemoveContainer" containerID="7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583" Apr 24 22:15:49.759870 ip-10-0-134-232 kubenswrapper[2568]: E0424 22:15:49.759850 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583\": container with ID starting with 7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583 not found: ID does not exist" containerID="7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583" Apr 24 22:15:49.759916 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.759880 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583"} err="failed to get container status \"7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583\": rpc error: code = NotFound desc = could not find container \"7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583\": container with ID starting with 7edf6052b551a1d5265ad0a47b41c87196d9e2941049c676e3f461af440c7583 not found: ID does not exist" Apr 24 22:15:49.759916 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.759898 2568 scope.go:117] "RemoveContainer" containerID="d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524" Apr 24 22:15:49.760147 ip-10-0-134-232 
kubenswrapper[2568]: E0424 22:15:49.760130 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524\": container with ID starting with d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524 not found: ID does not exist" containerID="d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524" Apr 24 22:15:49.760197 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.760151 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524"} err="failed to get container status \"d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524\": rpc error: code = NotFound desc = could not find container \"d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524\": container with ID starting with d36f89dbce3e33f9e4b6fd5c93c1fe1076bd0ef8d1aa7e531f1773f967828524 not found: ID does not exist" Apr 24 22:15:49.807776 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.807719 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-kserve-provision-location\") pod \"d06de066-ad28-4e5a-8cb7-fc225af37098\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " Apr 24 22:15:49.807776 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.807757 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w48pf\" (UniqueName: \"kubernetes.io/projected/d06de066-ad28-4e5a-8cb7-fc225af37098-kube-api-access-w48pf\") pod \"d06de066-ad28-4e5a-8cb7-fc225af37098\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " Apr 24 22:15:49.807776 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.807774 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-cache\") pod \"d06de066-ad28-4e5a-8cb7-fc225af37098\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " Apr 24 22:15:49.807979 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.807846 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-uds\") pod \"d06de066-ad28-4e5a-8cb7-fc225af37098\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " Apr 24 22:15:49.807979 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.807884 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d06de066-ad28-4e5a-8cb7-fc225af37098-tls-certs\") pod \"d06de066-ad28-4e5a-8cb7-fc225af37098\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " Apr 24 22:15:49.807979 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.807912 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-tmp\") pod \"d06de066-ad28-4e5a-8cb7-fc225af37098\" (UID: \"d06de066-ad28-4e5a-8cb7-fc225af37098\") " Apr 24 22:15:49.808176 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.808154 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d06de066-ad28-4e5a-8cb7-fc225af37098" (UID: "d06de066-ad28-4e5a-8cb7-fc225af37098"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:15:49.808176 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.808139 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d06de066-ad28-4e5a-8cb7-fc225af37098" (UID: "d06de066-ad28-4e5a-8cb7-fc225af37098"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:15:49.808292 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.808267 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d06de066-ad28-4e5a-8cb7-fc225af37098" (UID: "d06de066-ad28-4e5a-8cb7-fc225af37098"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:15:49.808488 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.808467 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d06de066-ad28-4e5a-8cb7-fc225af37098" (UID: "d06de066-ad28-4e5a-8cb7-fc225af37098"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:15:49.809840 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.809817 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06de066-ad28-4e5a-8cb7-fc225af37098-kube-api-access-w48pf" (OuterVolumeSpecName: "kube-api-access-w48pf") pod "d06de066-ad28-4e5a-8cb7-fc225af37098" (UID: "d06de066-ad28-4e5a-8cb7-fc225af37098"). InnerVolumeSpecName "kube-api-access-w48pf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:15:49.809944 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.809928 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06de066-ad28-4e5a-8cb7-fc225af37098-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d06de066-ad28-4e5a-8cb7-fc225af37098" (UID: "d06de066-ad28-4e5a-8cb7-fc225af37098"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:15:49.909277 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.909246 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-uds\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:15:49.909277 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.909273 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d06de066-ad28-4e5a-8cb7-fc225af37098-tls-certs\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:15:49.909277 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.909282 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-tmp\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:15:49.909470 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.909292 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-kserve-provision-location\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:15:49.909470 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.909302 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w48pf\" (UniqueName: 
\"kubernetes.io/projected/d06de066-ad28-4e5a-8cb7-fc225af37098-kube-api-access-w48pf\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:15:49.909470 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.909311 2568 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d06de066-ad28-4e5a-8cb7-fc225af37098-tokenizer-cache\") on node \"ip-10-0-134-232.ec2.internal\" DevicePath \"\"" Apr 24 22:15:49.939348 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:49.939315 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.44:9003\" within 1s: context deadline exceeded" Apr 24 22:15:50.056078 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:50.056048 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss"] Apr 24 22:15:50.060350 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:50.060291 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7cdc4fc86kqzss"] Apr 24 22:15:51.916138 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:15:51.916100 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" path="/var/lib/kubelet/pods/d06de066-ad28-4e5a-8cb7-fc225af37098/volumes" Apr 24 22:16:21.159052 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:21.159025 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jf4qz_3597e4d1-468f-4024-bb70-46b7952accf1/authorino/0.log" Apr 24 22:16:21.178346 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:21.178320 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-4d9ct_dbc35bae-0b55-443b-a605-04ac2db9b369/manager/0.log" Apr 24 22:16:23.668664 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.668616 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wmq6v/must-gather-qrvlz"] Apr 24 22:16:23.669811 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669779 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="storage-initializer" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669822 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="storage-initializer" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669845 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="main" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669854 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="main" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669872 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="tokenizer" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669882 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="tokenizer" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669903 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="main" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669911 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="main" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669944 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="storage-initializer" Apr 24 22:16:23.669955 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669952 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="storage-initializer" Apr 24 22:16:23.670425 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669965 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="tokenizer" Apr 24 22:16:23.670425 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.669973 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="tokenizer" Apr 24 22:16:23.671065 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.671018 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="tokenizer" Apr 24 22:16:23.671065 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.671045 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="main" Apr 24 22:16:23.671065 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.671057 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d06de066-ad28-4e5a-8cb7-fc225af37098" containerName="main" Apr 24 22:16:23.671065 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.671066 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4af7a02a-4bbe-45be-ab19-c3a4030fb83f" containerName="tokenizer" Apr 24 22:16:23.674762 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.674740 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-must-gather-wmq6v/must-gather-qrvlz"] Apr 24 22:16:23.674888 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.674848 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmq6v/must-gather-qrvlz" Apr 24 22:16:23.677192 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.677170 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wmq6v\"/\"openshift-service-ca.crt\"" Apr 24 22:16:23.677309 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.677236 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wmq6v\"/\"kube-root-ca.crt\"" Apr 24 22:16:23.677309 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.677300 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wmq6v\"/\"default-dockercfg-4l88c\"" Apr 24 22:16:23.696325 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.696303 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f4ph\" (UniqueName: \"kubernetes.io/projected/409b34e1-4379-44bb-8a28-ae4d11c892fb-kube-api-access-5f4ph\") pod \"must-gather-qrvlz\" (UID: \"409b34e1-4379-44bb-8a28-ae4d11c892fb\") " pod="openshift-must-gather-wmq6v/must-gather-qrvlz" Apr 24 22:16:23.696452 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.696351 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/409b34e1-4379-44bb-8a28-ae4d11c892fb-must-gather-output\") pod \"must-gather-qrvlz\" (UID: \"409b34e1-4379-44bb-8a28-ae4d11c892fb\") " pod="openshift-must-gather-wmq6v/must-gather-qrvlz" Apr 24 22:16:23.797354 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.797328 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/409b34e1-4379-44bb-8a28-ae4d11c892fb-must-gather-output\") pod \"must-gather-qrvlz\" (UID: \"409b34e1-4379-44bb-8a28-ae4d11c892fb\") " pod="openshift-must-gather-wmq6v/must-gather-qrvlz" Apr 24 22:16:23.797536 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.797436 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f4ph\" (UniqueName: \"kubernetes.io/projected/409b34e1-4379-44bb-8a28-ae4d11c892fb-kube-api-access-5f4ph\") pod \"must-gather-qrvlz\" (UID: \"409b34e1-4379-44bb-8a28-ae4d11c892fb\") " pod="openshift-must-gather-wmq6v/must-gather-qrvlz" Apr 24 22:16:23.797746 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.797695 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/409b34e1-4379-44bb-8a28-ae4d11c892fb-must-gather-output\") pod \"must-gather-qrvlz\" (UID: \"409b34e1-4379-44bb-8a28-ae4d11c892fb\") " pod="openshift-must-gather-wmq6v/must-gather-qrvlz" Apr 24 22:16:23.808358 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.808337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f4ph\" (UniqueName: \"kubernetes.io/projected/409b34e1-4379-44bb-8a28-ae4d11c892fb-kube-api-access-5f4ph\") pod \"must-gather-qrvlz\" (UID: \"409b34e1-4379-44bb-8a28-ae4d11c892fb\") " pod="openshift-must-gather-wmq6v/must-gather-qrvlz" Apr 24 22:16:23.985345 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:23.985268 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wmq6v/must-gather-qrvlz"
Apr 24 22:16:24.112859 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:24.112829 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmq6v/must-gather-qrvlz"]
Apr 24 22:16:24.115001 ip-10-0-134-232 kubenswrapper[2568]: W0424 22:16:24.114973 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod409b34e1_4379_44bb_8a28_ae4d11c892fb.slice/crio-5d21a01a3bfb2838e55368904580e582e12f3972002988dadd793ee4e35d678b WatchSource:0}: Error finding container 5d21a01a3bfb2838e55368904580e582e12f3972002988dadd793ee4e35d678b: Status 404 returned error can't find the container with id 5d21a01a3bfb2838e55368904580e582e12f3972002988dadd793ee4e35d678b
Apr 24 22:16:24.854107 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:24.854063 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/must-gather-qrvlz" event={"ID":"409b34e1-4379-44bb-8a28-ae4d11c892fb","Type":"ContainerStarted","Data":"5d21a01a3bfb2838e55368904580e582e12f3972002988dadd793ee4e35d678b"}
Apr 24 22:16:25.863501 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:25.863462 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/must-gather-qrvlz" event={"ID":"409b34e1-4379-44bb-8a28-ae4d11c892fb","Type":"ContainerStarted","Data":"4f048a6b7ff5da463a8717669797a5c979064be2cdde3755d3c8143f43860352"}
Apr 24 22:16:25.863501 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:25.863499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/must-gather-qrvlz" event={"ID":"409b34e1-4379-44bb-8a28-ae4d11c892fb","Type":"ContainerStarted","Data":"47585791e1765be48fd7e165e030208213edb32bc143c85662a77a18dc9d83a5"}
Apr 24 22:16:25.885143 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:25.885080 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wmq6v/must-gather-qrvlz" podStartSLOduration=2.04994836 podStartE2EDuration="2.885059396s" podCreationTimestamp="2026-04-24 22:16:23 +0000 UTC" firstStartedPulling="2026-04-24 22:16:24.116777384 +0000 UTC m=+2908.719846150" lastFinishedPulling="2026-04-24 22:16:24.951888428 +0000 UTC m=+2909.554957186" observedRunningTime="2026-04-24 22:16:25.882277464 +0000 UTC m=+2910.485346245" watchObservedRunningTime="2026-04-24 22:16:25.885059396 +0000 UTC m=+2910.488128172"
Apr 24 22:16:26.707367 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:26.707333 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pblbn_2e1fa258-3349-404c-99a9-780be75b2a17/global-pull-secret-syncer/0.log"
Apr 24 22:16:26.817586 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:26.817405 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jfk4d_057ddaa6-b8e6-4ac0-80db-273cd674b914/konnectivity-agent/0.log"
Apr 24 22:16:26.899286 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:26.899234 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-232.ec2.internal_003d7feb7f049475a7b903bcca7b7224/haproxy/0.log"
Apr 24 22:16:30.894311 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:30.894237 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jf4qz_3597e4d1-468f-4024-bb70-46b7952accf1/authorino/0.log"
Apr 24 22:16:30.928188 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:30.928153 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-4d9ct_dbc35bae-0b55-443b-a605-04ac2db9b369/manager/0.log"
Apr 24 22:16:32.073070 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.073037 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1dc2849c-4a7c-4571-a621-370a0e01f551/alertmanager/0.log"
Apr 24 22:16:32.111269 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.111236 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1dc2849c-4a7c-4571-a621-370a0e01f551/config-reloader/0.log"
Apr 24 22:16:32.147800 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.147773 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1dc2849c-4a7c-4571-a621-370a0e01f551/kube-rbac-proxy-web/0.log"
Apr 24 22:16:32.177088 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.176924 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1dc2849c-4a7c-4571-a621-370a0e01f551/kube-rbac-proxy/0.log"
Apr 24 22:16:32.203391 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.203364 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1dc2849c-4a7c-4571-a621-370a0e01f551/kube-rbac-proxy-metric/0.log"
Apr 24 22:16:32.231079 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.231051 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1dc2849c-4a7c-4571-a621-370a0e01f551/prom-label-proxy/0.log"
Apr 24 22:16:32.253264 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.253232 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1dc2849c-4a7c-4571-a621-370a0e01f551/init-config-reloader/0.log"
Apr 24 22:16:32.320664 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.320620 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-n9hgt_8092c8bb-c267-4686-9002-7a52d9a90961/kube-state-metrics/0.log"
Apr 24 22:16:32.342934 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.342861 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-n9hgt_8092c8bb-c267-4686-9002-7a52d9a90961/kube-rbac-proxy-main/0.log"
Apr 24 22:16:32.365282 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.365243 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-n9hgt_8092c8bb-c267-4686-9002-7a52d9a90961/kube-rbac-proxy-self/0.log"
Apr 24 22:16:32.394130 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.394097 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7d7c9954dd-2z9dc_a4eb4b4f-7919-4494-8c58-56a18513db6a/metrics-server/0.log"
Apr 24 22:16:32.425570 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.425537 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-w8ld8_701fdaca-2cf1-4a21-bb63-c14ca1774495/monitoring-plugin/0.log"
Apr 24 22:16:32.568415 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.568378 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k927f_1edf281d-115d-4f99-b5a2-1ad03eedf97d/node-exporter/0.log"
Apr 24 22:16:32.599627 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.599564 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k927f_1edf281d-115d-4f99-b5a2-1ad03eedf97d/kube-rbac-proxy/0.log"
Apr 24 22:16:32.638874 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.638848 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k927f_1edf281d-115d-4f99-b5a2-1ad03eedf97d/init-textfile/0.log"
Apr 24 22:16:32.758464 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.758045 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zg5f5_19d3d197-f883-46f9-95f2-a59482064cd0/kube-rbac-proxy-main/0.log"
Apr 24 22:16:32.787237 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.787208 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zg5f5_19d3d197-f883-46f9-95f2-a59482064cd0/kube-rbac-proxy-self/0.log"
Apr 24 22:16:32.816743 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.815932 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zg5f5_19d3d197-f883-46f9-95f2-a59482064cd0/openshift-state-metrics/0.log"
Apr 24 22:16:32.879873 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.879788 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd913094-9ed8-4431-8f8f-1b6220e14c55/prometheus/0.log"
Apr 24 22:16:32.900942 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.900910 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd913094-9ed8-4431-8f8f-1b6220e14c55/config-reloader/0.log"
Apr 24 22:16:32.930034 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.929997 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd913094-9ed8-4431-8f8f-1b6220e14c55/thanos-sidecar/0.log"
Apr 24 22:16:32.952469 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.952440 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd913094-9ed8-4431-8f8f-1b6220e14c55/kube-rbac-proxy-web/0.log"
Apr 24 22:16:32.976131 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:32.976100 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd913094-9ed8-4431-8f8f-1b6220e14c55/kube-rbac-proxy/0.log"
Apr 24 22:16:33.002382 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.002348 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd913094-9ed8-4431-8f8f-1b6220e14c55/kube-rbac-proxy-thanos/0.log"
Apr 24 22:16:33.028002 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.027974 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cd913094-9ed8-4431-8f8f-1b6220e14c55/init-config-reloader/0.log"
Apr 24 22:16:33.153411 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.153334 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bbc6c965b-tzqbm_ab725086-c0f4-4827-ba09-3bcd0a0470e0/telemeter-client/0.log"
Apr 24 22:16:33.178274 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.178237 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bbc6c965b-tzqbm_ab725086-c0f4-4827-ba09-3bcd0a0470e0/reload/0.log"
Apr 24 22:16:33.200429 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.200391 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bbc6c965b-tzqbm_ab725086-c0f4-4827-ba09-3bcd0a0470e0/kube-rbac-proxy/0.log"
Apr 24 22:16:33.235308 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.235275 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64fd45769b-zll9c_3fce3a25-2bba-432f-984f-bedaa6e050c1/thanos-query/0.log"
Apr 24 22:16:33.258903 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.258853 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64fd45769b-zll9c_3fce3a25-2bba-432f-984f-bedaa6e050c1/kube-rbac-proxy-web/0.log"
Apr 24 22:16:33.282356 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.282330 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64fd45769b-zll9c_3fce3a25-2bba-432f-984f-bedaa6e050c1/kube-rbac-proxy/0.log"
Apr 24 22:16:33.308118 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.308089 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64fd45769b-zll9c_3fce3a25-2bba-432f-984f-bedaa6e050c1/prom-label-proxy/0.log"
Apr 24 22:16:33.334409 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.334381 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64fd45769b-zll9c_3fce3a25-2bba-432f-984f-bedaa6e050c1/kube-rbac-proxy-rules/0.log"
Apr 24 22:16:33.358864 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:33.358832 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64fd45769b-zll9c_3fce3a25-2bba-432f-984f-bedaa6e050c1/kube-rbac-proxy-metrics/0.log"
Apr 24 22:16:35.286142 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.286106 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"]
Apr 24 22:16:35.292019 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.291993 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.300285 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.300261 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"]
Apr 24 22:16:35.408031 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.408001 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-proc\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.408031 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.408045 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-lib-modules\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.408235 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.408077 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-podres\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.408235 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.408095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-sys\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.408235 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.408117 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2dsn\" (UniqueName: \"kubernetes.io/projected/fdf69a57-ef20-4ff2-a6de-ad2559b583df-kube-api-access-h2dsn\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.508946 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.508915 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-podres\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.509133 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.508956 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-sys\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.509133 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.508988 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2dsn\" (UniqueName: \"kubernetes.io/projected/fdf69a57-ef20-4ff2-a6de-ad2559b583df-kube-api-access-h2dsn\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.509133 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.509035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-sys\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.509133 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.509068 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-proc\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.509133 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.509093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-podres\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.509133 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.509110 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-lib-modules\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.509369 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.509104 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-proc\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.509369 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.509205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fdf69a57-ef20-4ff2-a6de-ad2559b583df-lib-modules\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.523742 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.523698 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2dsn\" (UniqueName: \"kubernetes.io/projected/fdf69a57-ef20-4ff2-a6de-ad2559b583df-kube-api-access-h2dsn\") pod \"perf-node-gather-daemonset-dvh9n\" (UID: \"fdf69a57-ef20-4ff2-a6de-ad2559b583df\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.605862 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.605752 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.761966 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.761918 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"]
Apr 24 22:16:35.764747 ip-10-0-134-232 kubenswrapper[2568]: W0424 22:16:35.764724 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfdf69a57_ef20_4ff2_a6de_ad2559b583df.slice/crio-ac1ea6bf6a7c792f0126175c145b081328ff434d974072509c5f09846899d78c WatchSource:0}: Error finding container ac1ea6bf6a7c792f0126175c145b081328ff434d974072509c5f09846899d78c: Status 404 returned error can't find the container with id ac1ea6bf6a7c792f0126175c145b081328ff434d974072509c5f09846899d78c
Apr 24 22:16:35.906310 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.906224 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n" event={"ID":"fdf69a57-ef20-4ff2-a6de-ad2559b583df","Type":"ContainerStarted","Data":"f1de8ef0fb29d7646b912e8898c3143f61d170975c12fc5ed0ef11fb3e40cd21"}
Apr 24 22:16:35.906310 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.906265 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n" event={"ID":"fdf69a57-ef20-4ff2-a6de-ad2559b583df","Type":"ContainerStarted","Data":"ac1ea6bf6a7c792f0126175c145b081328ff434d974072509c5f09846899d78c"}
Apr 24 22:16:35.906310 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.906298 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:35.926611 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:35.926566 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n" podStartSLOduration=0.926550325 podStartE2EDuration="926.550325ms" podCreationTimestamp="2026-04-24 22:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:16:35.924043106 +0000 UTC m=+2920.527111886" watchObservedRunningTime="2026-04-24 22:16:35.926550325 +0000 UTC m=+2920.529619107"
Apr 24 22:16:36.945682 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:36.945651 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rbrfl_7fe12bc3-3098-4d2a-bf02-8982253438e3/dns/0.log"
Apr 24 22:16:36.967074 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:36.967050 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rbrfl_7fe12bc3-3098-4d2a-bf02-8982253438e3/kube-rbac-proxy/0.log"
Apr 24 22:16:36.993995 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:36.993972 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9v87r_66b6cbbe-44e0-47d7-8578-ce2ea2980a91/dns-node-resolver/0.log"
Apr 24 22:16:37.523601 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:37.523566 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8fpvh_6406f30a-30e3-4227-9670-db1cb68f44b9/node-ca/0.log"
Apr 24 22:16:38.978631 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:38.978592 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-whtpb_b82dd24e-152b-4750-951d-1506b5854df1/serve-healthcheck-canary/0.log"
Apr 24 22:16:39.596994 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:39.596961 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vqxb9_2bdbfe10-ce76-4858-b4e6-d9443744ee5d/kube-rbac-proxy/0.log"
Apr 24 22:16:39.617982 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:39.617961 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vqxb9_2bdbfe10-ce76-4858-b4e6-d9443744ee5d/exporter/0.log"
Apr 24 22:16:39.640421 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:39.640394 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vqxb9_2bdbfe10-ce76-4858-b4e6-d9443744ee5d/extractor/0.log"
Apr 24 22:16:41.920401 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:41.920370 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-dvh9n"
Apr 24 22:16:42.234490 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:42.234421 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5774f66dc9-rt46l_dedbecbe-3e87-4e0c-b877-3e8d7b56d8db/manager/0.log"
Apr 24 22:16:42.790952 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:42.790912 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-67f77cd7d7-ffx4r_ec0d4508-2f84-4216-af5f-dd2e9f0cd920/manager/0.log"
Apr 24 22:16:49.543802 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:49.543772 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4pwtf_05e6f705-e1bb-4e36-a24a-612ad7cf0c56/kube-multus/0.log"
Apr 24 22:16:49.739065 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:49.739031 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9kdl_4608f5de-7826-4605-82e0-fc8f5d0e4830/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:16:49.762949 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:49.762923 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9kdl_4608f5de-7826-4605-82e0-fc8f5d0e4830/egress-router-binary-copy/0.log"
Apr 24 22:16:49.785089 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:49.785065 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9kdl_4608f5de-7826-4605-82e0-fc8f5d0e4830/cni-plugins/0.log"
Apr 24 22:16:49.806457 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:49.806404 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9kdl_4608f5de-7826-4605-82e0-fc8f5d0e4830/bond-cni-plugin/0.log"
Apr 24 22:16:49.827776 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:49.827756 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9kdl_4608f5de-7826-4605-82e0-fc8f5d0e4830/routeoverride-cni/0.log"
Apr 24 22:16:49.852580 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:49.852559 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9kdl_4608f5de-7826-4605-82e0-fc8f5d0e4830/whereabouts-cni-bincopy/0.log"
Apr 24 22:16:49.876355 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:49.876333 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9kdl_4608f5de-7826-4605-82e0-fc8f5d0e4830/whereabouts-cni/0.log"
Apr 24 22:16:50.226632 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:50.226551 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xrlcl_7ca2ae96-23c0-4771-ba4d-46f95e147eb7/network-metrics-daemon/0.log"
Apr 24 22:16:50.249445 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:50.249412 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xrlcl_7ca2ae96-23c0-4771-ba4d-46f95e147eb7/kube-rbac-proxy/0.log"
Apr 24 22:16:51.343459 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.343418 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-controller/0.log"
Apr 24 22:16:51.360925 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.360899 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/0.log"
Apr 24 22:16:51.387400 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.387372 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovn-acl-logging/1.log"
Apr 24 22:16:51.407839 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.407817 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/kube-rbac-proxy-node/0.log"
Apr 24 22:16:51.429174 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.429152 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:16:51.450418 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.450397 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/northd/0.log"
Apr 24 22:16:51.471470 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.471434 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/nbdb/0.log"
Apr 24 22:16:51.492921 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.492899 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/sbdb/0.log"
Apr 24 22:16:51.686611 ip-10-0-134-232 kubenswrapper[2568]: I0424 22:16:51.686582 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c6xp2_6639d316-2e21-49c5-baad-539d3602282a/ovnkube-controller/0.log"