Apr 24 21:24:12.846623 ip-10-0-133-73 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:24:12.846637 ip-10-0-133-73 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:24:12.846647 ip-10-0-133-73 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:24:12.846898 ip-10-0-133-73 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:24:23.089256 ip-10-0-133-73 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:24:23.089272 ip-10-0-133-73 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 549a83cb605047ccb2e06118ad927c47 --
Apr 24 21:26:28.632017 ip-10-0-133-73 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:26:29.136984 ip-10-0-133-73 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:29.136984 ip-10-0-133-73 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:26:29.136984 ip-10-0-133-73 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:29.136984 ip-10-0-133-73 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:26:29.136984 ip-10-0-133-73 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:29.138096 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.137810    2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:26:29.141201 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141184    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:29.141201 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141200    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141205    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141208    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141211    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141213    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141216    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141219    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141221    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141226    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141231    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141234    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141237    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141239    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141242    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141245    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141253    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141256    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141259    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141261    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:29.141275 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141264    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141266    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141269    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141271    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141273    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141276    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141279    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141282    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141284    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141287    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141290    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141292    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141295    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141297    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141300    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141302    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141305    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141307    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141310    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:29.141719 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141313    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141315    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141317    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141320    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141322    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141324    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141327    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141330    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141332    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141335    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141338    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141340    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141343    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141345    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141347    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141350    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141352    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141355    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141357    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141359    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:29.142154 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141361    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141364    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141366    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141369    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141371    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141373    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141376    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141378    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141380    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141383    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141385    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141388    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141390    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141392    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141396    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141398    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141402    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141405    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141407    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:29.142610 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141411    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141413    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141415    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141418    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141420    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141422    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141425    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141427    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141824    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141830    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141833    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141836    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141839    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141842    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141845    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141847    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141850    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141852    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141854    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141857    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:29.143127 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141859    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141862    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141864    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141867    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141869    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141872    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141874    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141877    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141879    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141881    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141884    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141886    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141889    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141891    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141894    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141896    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141899    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141902    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141904    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141907    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:29.143850 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141909    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141912    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141915    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141917    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141919    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141922    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141924    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141928    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141930    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141933    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141935    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141937    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141939    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141942    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141944    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141948    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141951    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141954    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141957    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:29.144375 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141960    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141964    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141966    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141969    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141972    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141974    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141977    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141979    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141982    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141984    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141987    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141989    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141992    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141994    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141996    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.141998    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142001    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142003    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142006    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142008    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:29.144852 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142010    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142014    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142016    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142018    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142021    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142023    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142025    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142028    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142030    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142032    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142035    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142037    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142040    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142042    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.142044    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142692    2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142701    2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142707    2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142718    2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142722    2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142726    2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:26:29.145325 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142730    2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142734    2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142738    2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142741    2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142745    2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142749    2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142751    2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142754    2569 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142757    2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142760    2569 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142763    2569 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142766    2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142768    2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142841    2569 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142848    2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142852    2569 flags.go:64] FLAG: --config-dir=""
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142856    2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142859    2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142864    2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142868    2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142871    2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142880    2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142884    2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142887    2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:26:29.145846 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142890    2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142896    2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142901    2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142908    2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142913    2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142922    2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.142926    2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143032    2569 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143040    2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143051    2569 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143058    2569 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143064    2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143070    2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143075 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143082 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143094 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143099 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143104 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143109 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143114 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143119 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143124 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143130 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143140 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143145 2569 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:26:29.146408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143152 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143156 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143162 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143167 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143172 2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143177 2569 flags.go:64] FLAG: --help="false"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143181 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-133-73.ec2.internal"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143192 2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143196 2569 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143201 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143207 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143213 2569 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143217 2569 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143222 2569 flags.go:64] FLAG: --image-service-endpoint=""
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143231 2569 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143242 2569 flags.go:64] FLAG: --kube-api-burst="100"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143248 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143253 2569 flags.go:64] FLAG: --kube-api-qps="50"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143257 2569 flags.go:64] FLAG: --kube-reserved=""
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143262 2569 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143267 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143272 2569 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143277 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143282 2569 flags.go:64] FLAG: --lock-file=""
Apr 24 21:26:29.147012 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143292 2569 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143297 2569 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143302 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143311 2569 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143315 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143321 2569 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143325 2569 flags.go:64] FLAG: --logging-format="text"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143330 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143340 2569 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143345 2569 flags.go:64] FLAG: --manifest-url=""
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143349 2569 flags.go:64] FLAG: --manifest-url-header=""
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143357 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143362 2569 flags.go:64] FLAG: --max-open-files="1000000"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143369 2569 flags.go:64] FLAG: --max-pods="110"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143374 2569 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143379 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143384 2569 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143393 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143398 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143403 2569 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143409 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143423 2569 flags.go:64] FLAG: --node-status-max-images="50"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143432 2569 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143437 2569 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 24 21:26:29.147571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143448 2569 flags.go:64] FLAG: --pod-cidr=""
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143452 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143460 2569 flags.go:64] FLAG: --pod-manifest-path=""
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143464 2569 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143469 2569 flags.go:64] FLAG: --pods-per-core="0"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143474 2569 flags.go:64] FLAG: --port="10250"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143479 2569 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143483 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02fe6a9c412a49c8f"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143492 2569 flags.go:64] FLAG: --qos-reserved=""
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143497 2569 flags.go:64] FLAG: --read-only-port="10255"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143502 2569 flags.go:64] FLAG: --register-node="true"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143507 2569 flags.go:64] FLAG: --register-schedulable="true"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143511 2569 flags.go:64] FLAG: --register-with-taints=""
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143516 2569 flags.go:64] FLAG: --registry-burst="10"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143521 2569 flags.go:64] FLAG: --registry-qps="5"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143526 2569 flags.go:64] FLAG: --reserved-cpus=""
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143531 2569 flags.go:64] FLAG: --reserved-memory=""
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143542 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143547 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143551 2569 flags.go:64] FLAG: --rotate-certificates="false"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143556 2569 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143561 2569 flags.go:64] FLAG: --runonce="false"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143565 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143570 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143575 2569 flags.go:64] FLAG: --seccomp-default="false"
Apr 24 21:26:29.148198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143584 2569 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143589 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143593 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143598 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143603 2569 flags.go:64] FLAG: --storage-driver-password="root"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143610 2569 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143616 2569 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143621 2569 flags.go:64] FLAG: --storage-driver-user="root"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143626 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143636 2569 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143641 2569 flags.go:64] FLAG: --system-cgroups=""
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143645 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143656 2569 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143660 2569 flags.go:64] FLAG: --tls-cert-file=""
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143664 2569 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143696 2569 flags.go:64] FLAG: --tls-min-version=""
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143701 2569 flags.go:64] FLAG: --tls-private-key-file=""
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143705 2569 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143709 2569 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143714 2569 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143719 2569 flags.go:64] FLAG: --v="2"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143725 2569 flags.go:64] FLAG: --version="false"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143864 2569 flags.go:64] FLAG: --vmodule=""
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143885 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.143891 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 24 21:26:29.148816 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144124 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144131 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144135 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144138 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144141 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144144 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144146 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144149 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144152 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144154 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144157 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144159 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144162 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144165 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144167 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144170 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144172 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144174 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144177 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:29.149428 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144179 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144182 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144184 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144187 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144189 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144192 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144195 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144197 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144199 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144202 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144204 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144207 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144210 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144212 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144215 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144217 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144220 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144222 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144225 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144227 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:29.149931 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144230 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144233 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144237 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144239 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144242 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144245 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144247 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144250 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144252 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144254 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144257 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144259 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144262 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144264 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144266 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144269 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144271 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144274 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144276 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144279 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:29.150425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144281 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144283 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144286 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144295 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144299 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144301 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144305 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144307 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144310 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144313 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144315 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144318 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144320 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144323 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144325 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144328 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144330 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144333 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144335 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144337 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:29.150915 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144340 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144342 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144345 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144347 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144350 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144352 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.144355 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.144363 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.150924 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.150940 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.150987 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.150992 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.150995 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.150998 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151001 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:29.151379 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151003 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151006 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151009 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151012 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151015 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151017 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151020 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151022 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151025 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151027 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151030 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151032 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151035 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151037 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151039 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151042 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151044 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151047 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151050 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:29.151822 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151054 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151058 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151061 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151064 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151067 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151070 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151073 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151076 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151079 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151082 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151084 2569 feature_gate.go:328]
unrecognized feature gate: AWSDedicatedHosts Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151087 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151089 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151091 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151094 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151096 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151103 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151106 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151108 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151111 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:29.152304 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151113 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151116 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151118 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151120 2569 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151123 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151125 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151128 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151130 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151132 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151135 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151137 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151140 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151143 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151145 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151147 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151150 2569 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesAzure Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151152 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151155 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151158 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:26:29.152799 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151161 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151165 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151168 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151170 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151172 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151175 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151177 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151179 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151182 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: 
W0424 21:26:29.151184 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151187 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151190 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151192 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151195 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151197 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151200 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151203 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151205 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151207 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151210 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:26:29.153257 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151212 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151214 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 
21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151217 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.151222 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151312 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151317 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151320 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151323 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151325 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151328 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151330 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151333 2569 feature_gate.go:328] unrecognized feature gate: 
UpgradeStatus Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151336 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151338 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151341 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:26:29.153750 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151343 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151345 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151348 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151350 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151352 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151355 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151357 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151359 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151362 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151365 2569 feature_gate.go:328] unrecognized 
feature gate: GatewayAPI Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151367 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151369 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151372 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151374 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151376 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151378 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151381 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151383 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151386 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151388 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:26:29.154108 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151390 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151393 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:26:29.154576 ip-10-0-133-73 
kubenswrapper[2569]: W0424 21:26:29.151396 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151398 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151401 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151403 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151405 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151408 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151410 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151414 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151417 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151420 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151422 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151424 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151427 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151429 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151431 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151434 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151436 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:29.154576 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151438 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151440 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151443 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151445 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151448 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151450 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151452 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151454 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151457 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151459 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151461 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151465 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151468 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151471 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151474 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151477 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151480 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151482 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151485 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:29.155033 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151488 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151490 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151493 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151495 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151497 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151500 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151502 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151504 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151507 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151509 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151511 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151513 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151516 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151518 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151521 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151523 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:29.155470 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:29.151525 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:29.155852 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.151530 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:29.155852 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.152352 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:26:29.161063 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.161049 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:26:29.162184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.162174 2569 server.go:1019] "Starting client certificate rotation"
Apr 24 21:26:29.162286 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.162271 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:29.162315 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.162310 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:29.192353 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.192336 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:29.194944 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.194928 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:29.212043 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.212025 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:26:29.218163 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.218145 2569 log.go:25] "Validated CRI v1 image API"
Apr 24 21:26:29.220086 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.220072 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:26:29.221610 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.221593 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:29.222947 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.222930 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9766a51c-164f-4a6c-8f9d-f4ca827292b9:/dev/nvme0n1p3 b0e22869-c1bb-446a-b5c5-49b56ae94b49:/dev/nvme0n1p4]
Apr 24 21:26:29.222996 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.222948 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:26:29.227964 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.227864 2569 manager.go:217] Machine: {Timestamp:2026-04-24 21:26:29.226579988 +0000 UTC m=+0.466933239 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3213357 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec202a38d726bfd347775cf1ad7ebdf5 SystemUUID:ec202a38-d726-bfd3-4777-5cf1ad7ebdf5 BootID:549a83cb-6050-47cc-b2e0-6118ad927c47 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e8:da:89:79:83 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e8:da:89:79:83 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:46:d1:22:9a:b5:e5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:26:29.227964 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.227961 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:26:29.228067 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.228056 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:26:29.231648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.231625 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:26:29.231809 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.231651 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-73.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:26:29.231850 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.231819 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:26:29.231850 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.231828 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:26:29.231850 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.231844 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:29.231932 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.231861 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:29.233259 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.233248 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:29.233367 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.233358 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:26:29.236580 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.236570 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:26:29.236611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.236590 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:26:29.236611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.236603 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:26:29.236665 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.236612 2569 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:26:29.236665 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.236628 2569 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 21:26:29.237871 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.237856 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:29.237948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.237881 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:29.237948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.237918 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-47wnj" Apr 24 21:26:29.241156 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.241139 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:26:29.242709 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.242695 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:26:29.243602 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243583 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:26:29.243602 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243605 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243615 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243620 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243625 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243631 2569 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243637 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243642 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243650 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243656 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:26:29.243748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243683 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:26:29.243966 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.243943 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:26:29.245105 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.245094 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:26:29.245105 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.245106 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:26:29.247790 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.247773 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-47wnj" Apr 24 21:26:29.248684 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.248658 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:26:29.248731 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.248707 2569 server.go:1295] "Started kubelet" Apr 24 21:26:29.248819 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.248793 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:26:29.248858 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:26:29.248810 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:26:29.248887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.248876 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:26:29.249525 ip-10-0-133-73 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:26:29.250051 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.250036 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-73.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:26:29.250108 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.250094 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-73.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:26:29.250163 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.250114 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:26:29.250163 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.250139 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:26:29.256548 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.256524 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:26:29.259430 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.259399 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 
21:26:29.259430 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.259414 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:29.259945 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.259927 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:26:29.259945 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.259947 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:26:29.260076 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.259967 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:26:29.260076 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260032 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:26:29.260076 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260044 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:26:29.260206 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260143 2569 factory.go:55] Registering systemd factory Apr 24 21:26:29.260206 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260173 2569 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:26:29.260401 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260384 2569 factory.go:153] Registering CRI-O factory Apr 24 21:26:29.260401 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260403 2569 factory.go:223] Registration of the crio container factory successfully Apr 24 21:26:29.260542 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.260419 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:29.260542 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260457 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: 
connect: no such file or directory Apr 24 21:26:29.260542 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260483 2569 factory.go:103] Registering Raw factory Apr 24 21:26:29.260542 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.260499 2569 manager.go:1196] Started watching for new ooms in manager Apr 24 21:26:29.261250 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.261023 2569 manager.go:319] Starting recovery of all containers Apr 24 21:26:29.262095 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.262073 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:29.263032 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.263010 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:26:29.264734 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.264712 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-73.ec2.internal\" not found" node="ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.271148 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.271135 2569 manager.go:324] Recovery completed Apr 24 21:26:29.274978 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.274965 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:29.277555 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.277538 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:29.277617 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.277569 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:29.277617 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.277581 2569 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:29.278082 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.278069 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:26:29.278135 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.278082 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:26:29.278135 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.278100 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:29.281725 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.281663 2569 policy_none.go:49] "None policy: Start" Apr 24 21:26:29.281802 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.281731 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:26:29.281802 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.281745 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:26:29.320239 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.320222 2569 manager.go:341] "Starting Device Plugin manager" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.320252 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.320265 2569 server.go:85] "Starting device plugin registration server" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.320488 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.320499 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.320601 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 
21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.320687 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.320696 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.321166 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.321198 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.324893 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.326488 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.327154 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.327190 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.327202 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.327243 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:26:29.340045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.329146 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:29.421254 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.421173 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:29.422086 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.422071 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:29.422149 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.422111 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:29.422149 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.422122 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:29.422149 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.422143 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.427855 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.427839 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal"] Apr 24 21:26:29.427912 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.427904 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller 
attach/detach" Apr 24 21:26:29.429737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.429723 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:29.429794 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.429749 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:29.429794 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.429758 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:29.430161 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.430148 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.430192 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.430167 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-73.ec2.internal\": node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:29.431841 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.431829 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:29.431979 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.431967 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.432013 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.431992 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:29.432538 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.432515 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:29.432538 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.432524 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:29.432538 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.432542 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:29.432759 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.432545 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:29.432759 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.432554 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:29.432759 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.432562 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:29.434607 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.434590 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.434713 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.434622 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:29.435231 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.435218 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:29.435297 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.435246 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:29.435297 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.435259 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:29.440830 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.440811 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:29.458204 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.458150 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-73.ec2.internal\" not found" node="ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.461181 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.461161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.461251 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.461196 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.461251 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.461232 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d74e5b7f3ca859862ed6413284694748-config\") pod \"kube-apiserver-proxy-ip-10-0-133-73.ec2.internal\" (UID: \"d74e5b7f3ca859862ed6413284694748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.462277 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.462262 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-73.ec2.internal\" not found" node="ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.541141 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.541117 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:29.562228 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.562210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.562297 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.562232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.562297 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.562249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d74e5b7f3ca859862ed6413284694748-config\") pod \"kube-apiserver-proxy-ip-10-0-133-73.ec2.internal\" (UID: \"d74e5b7f3ca859862ed6413284694748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.562369 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.562296 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.562369 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.562310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d74e5b7f3ca859862ed6413284694748-config\") pod \"kube-apiserver-proxy-ip-10-0-133-73.ec2.internal\" (UID: \"d74e5b7f3ca859862ed6413284694748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.562369 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.562310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48fe24b0efea27e3f5eca2d71913b3e7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal\" (UID: \"48fe24b0efea27e3f5eca2d71913b3e7\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.642149 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.642114 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:29.742920 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.742865 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:29.760031 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.760011 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.765026 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:29.765009 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" Apr 24 21:26:29.843718 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.843666 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:29.944096 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:29.944072 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:30.044542 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.044517 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-73.ec2.internal\" not found" Apr 24 21:26:30.090580 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.090557 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:30.160683 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.160651 2569 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" Apr 24 21:26:30.161759 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.161746 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:26:30.161868 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.161854 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:30.161921 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.161901 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:30.161921 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.161903 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:30.177874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.177854 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:26:30.179583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.179571 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" Apr 24 21:26:30.189652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.189630 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, 
which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:30.236852 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.236814 2569 apiserver.go:52] "Watching apiserver"
Apr 24 21:26:30.244748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.244729 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:26:30.246875 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.246853 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-g5hcg","kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal","openshift-cluster-node-tuning-operator/tuned-kg22p","openshift-multus/multus-additional-cni-plugins-l64cf","openshift-multus/multus-csnv2","openshift-ovn-kubernetes/ovnkube-node-jgb8f","kube-system/konnectivity-agent-6bv5v","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7","openshift-image-registry/node-ca-9pchk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal","openshift-multus/network-metrics-daemon-7dwzn","openshift-network-diagnostics/network-check-target-bg9mt"]
Apr 24 21:26:30.249735 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.249709 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:21:29 +0000 UTC" deadline="2027-12-22 23:57:28.544143056 +0000 UTC"
Apr 24 21:26:30.249735 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.249732 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14570h30m58.294413081s"
Apr 24 21:26:30.252701 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.252663 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/iptables-alerter-g5hcg"
Apr 24 21:26:30.252986 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.252830 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.255689 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.255656 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:30.255777 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.255661 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:30.255777 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.255727 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:30.255777 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.255698 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:26:30.255777 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.255752 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kjl5r\""
Apr 24 21:26:30.255777 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.255753 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f45lh\""
Apr 24 21:26:30.256006 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.255890 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:30.257660 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.257644 2569 util.go:30] "No sandbox for pod can be
found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.258650 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.258155 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.259645 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.259624 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:26:30.260276 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.260255 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:26:30.260658 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.260643 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f6cv6\""
Apr 24 21:26:30.261139 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.261122 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4ldgr\""
Apr 24 21:26:30.261223 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.261180 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:26:30.261920 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.261890 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.262336 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.262320 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:26:30.262419 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.262334 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:26:30.262419 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.262356 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:26:30.262419 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.262336 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:26:30.264045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264028 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-6bv5v"
Apr 24 21:26:30.264328 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264307 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-run\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.264407 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264344 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-tuned\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.264407 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264364 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-kubelet\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.264476 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-slash\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.264476 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264440 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mqt\" (UniqueName:
\"kubernetes.io/projected/a34653fe-8931-4a96-adb4-518f9c93a246-kube-api-access-g9mqt\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.264476 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-run-ovn-kubernetes\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.264569 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264492 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2bd960cb-d43f-43af-84ee-6e0693fdb2da-iptables-alerter-script\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg"
Apr 24 21:26:30.264569 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264510 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-system-cni-dir\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.264569 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264524 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-cni-binary-copy\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") "
pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.264569 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264537 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.264569 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264554 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-socket-dir-parent\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264584 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-daemon-config\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-systemd\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264618 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName:
\"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-ovn\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264632 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-system-cni-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-hostroot\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264732 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-cni-bin\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264761 2569 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-cnibin\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d47b5659-b736-4e8d-abe4-3cee234ead85-cni-binary-copy\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.264796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264790 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-cni-bin\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264810 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-kubelet\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-env-overrides\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424
21:26:30.264836 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/376860d0-150d-43d2-ba76-6f4bd2a03019-tmp\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264851 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-ovnkube-script-lib\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264878 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-systemd-units\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264890 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-etc-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID:
\"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264911 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-node-log\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264925 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8rtv\" (UniqueName: \"kubernetes.io/projected/f94d99f1-b1b1-4885-b23c-789c312e3426-kube-api-access-m8rtv\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264958 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-modprobe-d\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.264986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-kubernetes\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265001 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName:
\"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysctl-conf\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-multus-certs\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265031 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-run-netns\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-systemd\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-host\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265105 2569 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysconfig\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265120 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-os-release\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-netns\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265167 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265188 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-sys\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") "
pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265207 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4zlh\" (UniqueName: \"kubernetes.io/projected/376860d0-150d-43d2-ba76-6f4bd2a03019-kube-api-access-n4zlh\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265221 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-k8s-cni-cncf-io\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265234 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-cni-netd\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265254 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nv6x\" (UniqueName: \"kubernetes.io/projected/2bd960cb-d43f-43af-84ee-6e0693fdb2da-kube-api-access-2nv6x\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName:
\"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-cni-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265295 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-conf-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265307 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-etc-kubernetes\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265320 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysctl-d\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-ovnkube-config\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265357 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-cnibin\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.265611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265370 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-var-lib-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.266275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265384 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f94d99f1-b1b1-4885-b23c-789c312e3426-ovn-node-metrics-cert\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.266275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265403 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bd960cb-d43f-43af-84ee-6e0693fdb2da-host-slash\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg"
Apr 24 21:26:30.266275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-var-lib-kubelet\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.266275
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265455 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-os-release\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.266275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265487 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-cni-multus\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.266275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265514 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.266275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265538 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-log-socket\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.266275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-lib-modules\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") "
pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.266275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.265579 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhsmp\" (UniqueName: \"kubernetes.io/projected/d47b5659-b736-4e8d-abe4-3cee234ead85-kube-api-access-lhsmp\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.266861 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.266353 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" Apr 24 21:26:30.266861 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.266711 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:26:30.266861 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.266851 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:26:30.266979 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.266948 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:26:30.266979 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.266954 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pdvjd\"" Apr 24 21:26:30.268008 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.267976 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:26:30.268087 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.268016 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 
21:26:30.268087 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.268032 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:26:30.268087 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.268037 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:26:30.268217 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.268109 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:26:30.268376 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.268364 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-m6fp9\"" Apr 24 21:26:30.268619 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.268607 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.268974 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.268958 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:26:30.269090 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.268958 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-tc6vd\"" Apr 24 21:26:30.269197 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.269184 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:26:30.270906 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.270890 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:30.270987 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.270947 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3" Apr 24 21:26:30.271144 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.271131 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:26:30.272365 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.272351 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:26:30.272462 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.272391 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:26:30.272648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.272633 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:26:30.272982 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.272968 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-cklfj\"" Apr 24 21:26:30.273119 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.273103 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:26:30.273164 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.273153 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db" Apr 24 21:26:30.286005 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.285987 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:26:30.294315 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.294288 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fe24b0efea27e3f5eca2d71913b3e7.slice/crio-ba3aab500940dc2501e4bb09ef4c30be4deedcea35e2cc4ee17c9d1c8b37a513 WatchSource:0}: Error finding container ba3aab500940dc2501e4bb09ef4c30be4deedcea35e2cc4ee17c9d1c8b37a513: Status 404 returned error can't find the container with id ba3aab500940dc2501e4bb09ef4c30be4deedcea35e2cc4ee17c9d1c8b37a513 Apr 24 21:26:30.294595 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.294543 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74e5b7f3ca859862ed6413284694748.slice/crio-2a335c587cff889d8b381a2d9095d72fa537f7754c065999891fb8276c1c6ec6 WatchSource:0}: Error finding container 2a335c587cff889d8b381a2d9095d72fa537f7754c065999891fb8276c1c6ec6: Status 404 returned error can't find the container with id 2a335c587cff889d8b381a2d9095d72fa537f7754c065999891fb8276c1c6ec6 Apr 24 21:26:30.300214 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:26:30.300198 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:26:30.311583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.311558 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-btnbg" Apr 24 21:26:30.321071 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.320972 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-btnbg" Apr 24 21:26:30.329911 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.329871 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" event={"ID":"d74e5b7f3ca859862ed6413284694748","Type":"ContainerStarted","Data":"2a335c587cff889d8b381a2d9095d72fa537f7754c065999891fb8276c1c6ec6"} Apr 24 21:26:30.330865 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.330841 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" event={"ID":"48fe24b0efea27e3f5eca2d71913b3e7","Type":"ContainerStarted","Data":"ba3aab500940dc2501e4bb09ef4c30be4deedcea35e2cc4ee17c9d1c8b37a513"} Apr 24 21:26:30.360803 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.360775 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:26:30.365923 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.365905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-etc-kubernetes\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.366014 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.365928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysctl-d\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.366014 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.365946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-ovnkube-config\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.366014 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.365966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eeb650ea-cab6-4757-8b65-a0b656f23baf-agent-certs\") pod \"konnectivity-agent-6bv5v\" (UID: \"eeb650ea-cab6-4757-8b65-a0b656f23baf\") " pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:30.366146 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-etc-kubernetes\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.366146 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-cnibin\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf" Apr 24 21:26:30.366146 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-var-lib-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.366146 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366106 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f94d99f1-b1b1-4885-b23c-789c312e3426-ovn-node-metrics-cert\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.366146 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366126 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-cnibin\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf" Apr 24 21:26:30.366146 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366130 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-sys-fs\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366159 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysctl-d\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366170 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/907a584d-065c-4c42-a7ff-3db1f2519bf9-serviceca\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366167 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-var-lib-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366193 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bd960cb-d43f-43af-84ee-6e0693fdb2da-host-slash\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-var-lib-kubelet\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-os-release\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366235 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2bd960cb-d43f-43af-84ee-6e0693fdb2da-host-slash\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-cni-multus\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366277 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-var-lib-kubelet\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-log-socket\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366308 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-os-release\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-cni-multus\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366325 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzpkn\" (UniqueName: \"kubernetes.io/projected/a63cbdf9-50e4-4a41-84c1-7803058b65cb-kube-api-access-fzpkn\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366356 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-lib-modules\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:26:30.366357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-log-socket\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.366414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366389 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhsmp\" (UniqueName: \"kubernetes.io/projected/d47b5659-b736-4e8d-abe4-3cee234ead85-kube-api-access-lhsmp\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eeb650ea-cab6-4757-8b65-a0b656f23baf-konnectivity-ca\") pod \"konnectivity-agent-6bv5v\" (UID: \"eeb650ea-cab6-4757-8b65-a0b656f23baf\") " pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366435 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-lib-modules\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366444 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-run\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366463 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-tuned\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366470 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366478 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-kubelet\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366515 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-ovnkube-config\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-slash\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-run\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366548 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mqt\" (UniqueName: \"kubernetes.io/projected/a34653fe-8931-4a96-adb4-518f9c93a246-kube-api-access-g9mqt\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-kubelet\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-slash\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366883 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-run-ovn-kubernetes\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366906 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/907a584d-065c-4c42-a7ff-3db1f2519bf9-host\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366935 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-run-ovn-kubernetes\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366947 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2bd960cb-d43f-43af-84ee-6e0693fdb2da-iptables-alerter-script\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg" Apr 24 21:26:30.367145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.366972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-system-cni-dir\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf" Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:26:30.366988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-cni-binary-copy\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf" Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367004 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf" Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367021 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-socket-dir-parent\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367043 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-daemon-config\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-systemd\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 
21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367080 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-system-cni-dir\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367088 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-ovn\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-socket-dir-parent\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367125 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-ovn\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-run-systemd\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-system-cni-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367217 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-hostroot\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-cni-bin\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367263 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-cnibin\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.368098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d47b5659-b736-4e8d-abe4-3cee234ead85-cni-binary-copy\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-cni-bin\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367379 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-kubelet\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367402 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-env-overrides\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6dx\" (UniqueName: \"kubernetes.io/projected/907a584d-065c-4c42-a7ff-3db1f2519bf9-kube-api-access-7n6dx\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367457 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/376860d0-150d-43d2-ba76-6f4bd2a03019-tmp\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367481 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-ovnkube-script-lib\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-systemd-units\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-etc-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-node-log\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367602 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8rtv\" (UniqueName: \"kubernetes.io/projected/f94d99f1-b1b1-4885-b23c-789c312e3426-kube-api-access-m8rtv\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-device-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-modprobe-d\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-kubernetes\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysctl-conf\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-multus-certs\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.368815 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-cni-binary-copy\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367766 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-daemon-config\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367787 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-run-netns\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-node-log\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367816 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367852 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-systemd\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367890 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-host\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-socket-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysconfig\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-os-release\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-netns\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2bd960cb-d43f-43af-84ee-6e0693fdb2da-iptables-alerter-script\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.367750 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368117 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-modprobe-d\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysctl-conf\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368183 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-cni-bin\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.369528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368193 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-system-cni-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368197 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a34653fe-8931-4a96-adb4-518f9c93a246-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368223 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-cnibin\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368231 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-systemd\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368250 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-systemd-units\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368268 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-sysconfig\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368301 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-hostroot\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368343 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-netns\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-host\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368413 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a34653fe-8931-4a96-adb4-518f9c93a246-os-release\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368374 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-registration-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368455 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-kubelet\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368472 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6n7l\" (UniqueName: \"kubernetes.io/projected/7ac4e3b6-523e-4f41-98be-ceb879813ac3-kube-api-access-z6n7l\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368485 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-etc-openvswitch\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368504 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-sys\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368512 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4zlh\" (UniqueName: \"kubernetes.io/projected/376860d0-150d-43d2-ba76-6f4bd2a03019-kube-api-access-n4zlh\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.370049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368558 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-kubernetes\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d47b5659-b736-4e8d-abe4-3cee234ead85-cni-binary-copy\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368598 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-k8s-cni-cncf-io\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-multus-certs\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368624 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-cni-netd\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368647 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/376860d0-150d-43d2-ba76-6f4bd2a03019-sys\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368708 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-ovnkube-script-lib\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nv6x\" (UniqueName: \"kubernetes.io/projected/2bd960cb-d43f-43af-84ee-6e0693fdb2da-kube-api-access-2nv6x\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368798 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-run-k8s-cni-cncf-io\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368827 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-cni-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368855 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-conf-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-host-var-lib-cni-bin\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-cni-netd\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368935 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-conf-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.368976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f94d99f1-b1b1-4885-b23c-789c312e3426-host-run-netns\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.369013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d47b5659-b736-4e8d-abe4-3cee234ead85-multus-cni-dir\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.370477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.369040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f94d99f1-b1b1-4885-b23c-789c312e3426-env-overrides\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370950 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.369748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f94d99f1-b1b1-4885-b23c-789c312e3426-ovn-node-metrics-cert\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.370950 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.369764 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/376860d0-150d-43d2-ba76-6f4bd2a03019-etc-tuned\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.370950 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.369852 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/376860d0-150d-43d2-ba76-6f4bd2a03019-tmp\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.380418 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.380399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4zlh\" (UniqueName: \"kubernetes.io/projected/376860d0-150d-43d2-ba76-6f4bd2a03019-kube-api-access-n4zlh\") pod \"tuned-kg22p\" (UID: \"376860d0-150d-43d2-ba76-6f4bd2a03019\") " pod="openshift-cluster-node-tuning-operator/tuned-kg22p"
Apr 24 21:26:30.381542 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.381524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8rtv\" (UniqueName: \"kubernetes.io/projected/f94d99f1-b1b1-4885-b23c-789c312e3426-kube-api-access-m8rtv\") pod \"ovnkube-node-jgb8f\" (UID: \"f94d99f1-b1b1-4885-b23c-789c312e3426\") " pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:26:30.381687 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.381654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhsmp\" (UniqueName: \"kubernetes.io/projected/d47b5659-b736-4e8d-abe4-3cee234ead85-kube-api-access-lhsmp\") pod \"multus-csnv2\" (UID: \"d47b5659-b736-4e8d-abe4-3cee234ead85\") " pod="openshift-multus/multus-csnv2"
Apr 24 21:26:30.382426 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.382408 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nv6x\" (UniqueName: \"kubernetes.io/projected/2bd960cb-d43f-43af-84ee-6e0693fdb2da-kube-api-access-2nv6x\") pod \"iptables-alerter-g5hcg\" (UID: \"2bd960cb-d43f-43af-84ee-6e0693fdb2da\") " pod="openshift-network-operator/iptables-alerter-g5hcg"
Apr 24 21:26:30.382492 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.382432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mqt\" (UniqueName: \"kubernetes.io/projected/a34653fe-8931-4a96-adb4-518f9c93a246-kube-api-access-g9mqt\") pod \"multus-additional-cni-plugins-l64cf\" (UID: \"a34653fe-8931-4a96-adb4-518f9c93a246\") " pod="openshift-multus/multus-additional-cni-plugins-l64cf"
Apr 24 21:26:30.469434 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469406 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-device-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.469434 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469434 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469449 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-socket-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-registration-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469480 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6n7l\" (UniqueName: \"kubernetes.io/projected/7ac4e3b6-523e-4f41-98be-ceb879813ac3-kube-api-access-z6n7l\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469496 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469519 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469520 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469552 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-registration-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469568 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eeb650ea-cab6-4757-8b65-a0b656f23baf-agent-certs\") pod \"konnectivity-agent-6bv5v\" (UID: \"eeb650ea-cab6-4757-8b65-a0b656f23baf\") " pod="kube-system/konnectivity-agent-6bv5v"
Apr 24 21:26:30.469589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469585 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-socket-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469617 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-sys-fs\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469644 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-device-dir\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7"
Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469643 2569
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/907a584d-065c-4c42-a7ff-3db1f2519bf9-serviceca\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzpkn\" (UniqueName: \"kubernetes.io/projected/a63cbdf9-50e4-4a41-84c1-7803058b65cb-kube-api-access-fzpkn\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469726 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a63cbdf9-50e4-4a41-84c1-7803058b65cb-sys-fs\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469737 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eeb650ea-cab6-4757-8b65-a0b656f23baf-konnectivity-ca\") pod \"konnectivity-agent-6bv5v\" (UID: \"eeb650ea-cab6-4757-8b65-a0b656f23baf\") " pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:30.470049 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:26:30.469789 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/907a584d-065c-4c42-a7ff-3db1f2519bf9-host\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469827 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6dx\" (UniqueName: \"kubernetes.io/projected/907a584d-065c-4c42-a7ff-3db1f2519bf9-kube-api-access-7n6dx\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.469869 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.469885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/907a584d-065c-4c42-a7ff-3db1f2519bf9-host\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.469955 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs podName:7ac4e3b6-523e-4f41-98be-ceb879813ac3 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:30.969922212 +0000 UTC m=+2.210275453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs") pod "network-metrics-daemon-7dwzn" (UID: "7ac4e3b6-523e-4f41-98be-ceb879813ac3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:30.470049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.470053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/907a584d-065c-4c42-a7ff-3db1f2519bf9-serviceca\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.470415 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.470279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eeb650ea-cab6-4757-8b65-a0b656f23baf-konnectivity-ca\") pod \"konnectivity-agent-6bv5v\" (UID: \"eeb650ea-cab6-4757-8b65-a0b656f23baf\") " pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:30.471737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.471716 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eeb650ea-cab6-4757-8b65-a0b656f23baf-agent-certs\") pod \"konnectivity-agent-6bv5v\" (UID: \"eeb650ea-cab6-4757-8b65-a0b656f23baf\") " pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:30.497876 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.497856 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:30.497876 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.497874 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:30.497987 ip-10-0-133-73 
kubenswrapper[2569]: E0424 21:26:30.497884 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fhtx4 for pod openshift-network-diagnostics/network-check-target-bg9mt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:30.497987 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.497945 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4 podName:562ff80c-46f9-46ea-bfc9-cacccd0662db nodeName:}" failed. No retries permitted until 2026-04-24 21:26:30.997932638 +0000 UTC m=+2.238285876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fhtx4" (UniqueName: "kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4") pod "network-check-target-bg9mt" (UID: "562ff80c-46f9-46ea-bfc9-cacccd0662db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:30.499306 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.499288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzpkn\" (UniqueName: \"kubernetes.io/projected/a63cbdf9-50e4-4a41-84c1-7803058b65cb-kube-api-access-fzpkn\") pod \"aws-ebs-csi-driver-node-76tm7\" (UID: \"a63cbdf9-50e4-4a41-84c1-7803058b65cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" Apr 24 21:26:30.500235 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.500216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n6dx\" (UniqueName: \"kubernetes.io/projected/907a584d-065c-4c42-a7ff-3db1f2519bf9-kube-api-access-7n6dx\") pod \"node-ca-9pchk\" (UID: \"907a584d-065c-4c42-a7ff-3db1f2519bf9\") " pod="openshift-image-registry/node-ca-9pchk" Apr 
24 21:26:30.500405 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.500390 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6n7l\" (UniqueName: \"kubernetes.io/projected/7ac4e3b6-523e-4f41-98be-ceb879813ac3-kube-api-access-z6n7l\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:30.589976 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.589917 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-g5hcg" Apr 24 21:26:30.597605 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.597582 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bd960cb_d43f_43af_84ee_6e0693fdb2da.slice/crio-b25f87d46a596cbd2f8e639ff04789293c3a0dcd408e7595d4ef698729b4605c WatchSource:0}: Error finding container b25f87d46a596cbd2f8e639ff04789293c3a0dcd408e7595d4ef698729b4605c: Status 404 returned error can't find the container with id b25f87d46a596cbd2f8e639ff04789293c3a0dcd408e7595d4ef698729b4605c Apr 24 21:26:30.598228 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.598213 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kg22p" Apr 24 21:26:30.603754 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.603731 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod376860d0_150d_43d2_ba76_6f4bd2a03019.slice/crio-6bae518f683c37f10344787625dfd8e82750c087c463b9ecbae6eade2f3c2817 WatchSource:0}: Error finding container 6bae518f683c37f10344787625dfd8e82750c087c463b9ecbae6eade2f3c2817: Status 404 returned error can't find the container with id 6bae518f683c37f10344787625dfd8e82750c087c463b9ecbae6eade2f3c2817 Apr 24 21:26:30.607248 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.607233 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l64cf" Apr 24 21:26:30.611039 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.611023 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-csnv2" Apr 24 21:26:30.612932 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.612915 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda34653fe_8931_4a96_adb4_518f9c93a246.slice/crio-ce8520a7853f0e6784483a790833b7802acb4c6e1bb4a3f7409fb17c21b476bd WatchSource:0}: Error finding container ce8520a7853f0e6784483a790833b7802acb4c6e1bb4a3f7409fb17c21b476bd: Status 404 returned error can't find the container with id ce8520a7853f0e6784483a790833b7802acb4c6e1bb4a3f7409fb17c21b476bd Apr 24 21:26:30.616393 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.616305 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:30.618075 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.618049 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd47b5659_b736_4e8d_abe4_3cee234ead85.slice/crio-30579144dd4708bb96e7e61eef412552ba7528064d5948c0a207ed59c1bc6548 WatchSource:0}: Error finding container 30579144dd4708bb96e7e61eef412552ba7528064d5948c0a207ed59c1bc6548: Status 404 returned error can't find the container with id 30579144dd4708bb96e7e61eef412552ba7528064d5948c0a207ed59c1bc6548 Apr 24 21:26:30.618907 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.618876 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:30.624010 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.623989 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:30.626859 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.626828 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf94d99f1_b1b1_4885_b23c_789c312e3426.slice/crio-41634379ffc8b9ccb3af967b0011f05ead62378faa38c966394116cadb671368 WatchSource:0}: Error finding container 41634379ffc8b9ccb3af967b0011f05ead62378faa38c966394116cadb671368: Status 404 returned error can't find the container with id 41634379ffc8b9ccb3af967b0011f05ead62378faa38c966394116cadb671368 Apr 24 21:26:30.628529 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.628508 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" Apr 24 21:26:30.631718 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.631694 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb650ea_cab6_4757_8b65_a0b656f23baf.slice/crio-c41fb0a665ab39edc4456a8e426b72a29dbb08959b70e537ae301ce789c81100 WatchSource:0}: Error finding container c41fb0a665ab39edc4456a8e426b72a29dbb08959b70e537ae301ce789c81100: Status 404 returned error can't find the container with id c41fb0a665ab39edc4456a8e426b72a29dbb08959b70e537ae301ce789c81100 Apr 24 21:26:30.633250 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.633099 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9pchk" Apr 24 21:26:30.638513 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.638353 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63cbdf9_50e4_4a41_84c1_7803058b65cb.slice/crio-d483bd415066c60386e55c714b37274f8beb20b850cb301f5ced0a0710beaa2e WatchSource:0}: Error finding container d483bd415066c60386e55c714b37274f8beb20b850cb301f5ced0a0710beaa2e: Status 404 returned error can't find the container with id d483bd415066c60386e55c714b37274f8beb20b850cb301f5ced0a0710beaa2e Apr 24 21:26:30.641877 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:30.641840 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod907a584d_065c_4c42_a7ff_3db1f2519bf9.slice/crio-dfaac9077fa1aa084f16e58bc5f7b85a664525870454698ffed787d1eb477fd4 WatchSource:0}: Error finding container dfaac9077fa1aa084f16e58bc5f7b85a664525870454698ffed787d1eb477fd4: Status 404 returned error can't find the container with id dfaac9077fa1aa084f16e58bc5f7b85a664525870454698ffed787d1eb477fd4 Apr 24 21:26:30.973833 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:30.973753 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:30.973980 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.973877 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:30.973980 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:30.973941 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs podName:7ac4e3b6-523e-4f41-98be-ceb879813ac3 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:31.973921901 +0000 UTC m=+3.214275141 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs") pod "network-metrics-daemon-7dwzn" (UID: "7ac4e3b6-523e-4f41-98be-ceb879813ac3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:31.074261 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.074222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:26:31.074446 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:31.074388 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:31.074446 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:31.074406 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:31.074446 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:31.074418 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fhtx4 for pod openshift-network-diagnostics/network-check-target-bg9mt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:31.074592 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:31.074473 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4 podName:562ff80c-46f9-46ea-bfc9-cacccd0662db nodeName:}" failed. 
No retries permitted until 2026-04-24 21:26:32.074453853 +0000 UTC m=+3.314807103 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fhtx4" (UniqueName: "kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4") pod "network-check-target-bg9mt" (UID: "562ff80c-46f9-46ea-bfc9-cacccd0662db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:31.321887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.321753 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:30 +0000 UTC" deadline="2027-10-25 18:47:49.533460434 +0000 UTC" Apr 24 21:26:31.321887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.321814 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13173h21m18.211651251s" Apr 24 21:26:31.329335 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.328830 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:31.329335 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:31.328979 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3" Apr 24 21:26:31.379956 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.379734 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" event={"ID":"a63cbdf9-50e4-4a41-84c1-7803058b65cb","Type":"ContainerStarted","Data":"d483bd415066c60386e55c714b37274f8beb20b850cb301f5ced0a0710beaa2e"} Apr 24 21:26:31.387121 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.387053 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6bv5v" event={"ID":"eeb650ea-cab6-4757-8b65-a0b656f23baf","Type":"ContainerStarted","Data":"c41fb0a665ab39edc4456a8e426b72a29dbb08959b70e537ae301ce789c81100"} Apr 24 21:26:31.392836 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.392656 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"41634379ffc8b9ccb3af967b0011f05ead62378faa38c966394116cadb671368"} Apr 24 21:26:31.396338 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.396276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerStarted","Data":"ce8520a7853f0e6784483a790833b7802acb4c6e1bb4a3f7409fb17c21b476bd"} Apr 24 21:26:31.402357 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.402297 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9pchk" event={"ID":"907a584d-065c-4c42-a7ff-3db1f2519bf9","Type":"ContainerStarted","Data":"dfaac9077fa1aa084f16e58bc5f7b85a664525870454698ffed787d1eb477fd4"} Apr 24 21:26:31.405452 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.405405 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csnv2" 
event={"ID":"d47b5659-b736-4e8d-abe4-3cee234ead85","Type":"ContainerStarted","Data":"30579144dd4708bb96e7e61eef412552ba7528064d5948c0a207ed59c1bc6548"} Apr 24 21:26:31.409888 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.409835 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kg22p" event={"ID":"376860d0-150d-43d2-ba76-6f4bd2a03019","Type":"ContainerStarted","Data":"6bae518f683c37f10344787625dfd8e82750c087c463b9ecbae6eade2f3c2817"} Apr 24 21:26:31.419324 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.419275 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-g5hcg" event={"ID":"2bd960cb-d43f-43af-84ee-6e0693fdb2da","Type":"ContainerStarted","Data":"b25f87d46a596cbd2f8e639ff04789293c3a0dcd408e7595d4ef698729b4605c"} Apr 24 21:26:31.580863 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.580787 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:31.677809 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.677691 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:31.981428 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:31.981346 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:31.981590 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:31.981506 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:31.981641 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:31.981618 2569 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs podName:7ac4e3b6-523e-4f41-98be-ceb879813ac3 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:33.981551572 +0000 UTC m=+5.221904812 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs") pod "network-metrics-daemon-7dwzn" (UID: "7ac4e3b6-523e-4f41-98be-ceb879813ac3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:32.082577 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:32.082544 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:26:32.082774 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:32.082751 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:32.082839 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:32.082779 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:32.082839 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:32.082793 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fhtx4 for pod openshift-network-diagnostics/network-check-target-bg9mt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:32.082933 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:32.082874 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4 podName:562ff80c-46f9-46ea-bfc9-cacccd0662db nodeName:}" failed. No retries permitted until 2026-04-24 21:26:34.082853998 +0000 UTC m=+5.323207242 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fhtx4" (UniqueName: "kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4") pod "network-check-target-bg9mt" (UID: "562ff80c-46f9-46ea-bfc9-cacccd0662db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:32.322782 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:32.322738 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:30 +0000 UTC" deadline="2028-01-02 23:49:50.42160944 +0000 UTC"
Apr 24 21:26:32.322782 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:32.322781 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14834h23m18.098832898s"
Apr 24 21:26:32.327418 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:32.327393 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:32.327533 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:32.327509 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:32.679125 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:32.679048 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:33.327861 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:33.327812 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:33.328306 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:33.327946 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:33.999901 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:33.999856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:34.000089 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:34.000026 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:34.000147 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:34.000094 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs podName:7ac4e3b6-523e-4f41-98be-ceb879813ac3 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:38.000074457 +0000 UTC m=+9.240427722 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs") pod "network-metrics-daemon-7dwzn" (UID: "7ac4e3b6-523e-4f41-98be-ceb879813ac3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:34.101329 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:34.101288 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:34.101503 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:34.101428 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:34.101503 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:34.101445 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:34.101503 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:34.101453 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fhtx4 for pod openshift-network-diagnostics/network-check-target-bg9mt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:34.101503 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:34.101496 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4 podName:562ff80c-46f9-46ea-bfc9-cacccd0662db nodeName:}" failed. No retries permitted until 2026-04-24 21:26:38.10148268 +0000 UTC m=+9.341835922 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fhtx4" (UniqueName: "kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4") pod "network-check-target-bg9mt" (UID: "562ff80c-46f9-46ea-bfc9-cacccd0662db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:34.327697 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:34.327590 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:34.327834 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:34.327740 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:35.327989 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:35.327957 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:35.328461 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:35.328099 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:36.328262 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:36.327870 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:36.328262 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:36.327978 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:37.329267 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:37.329235 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:37.329623 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:37.329416 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:38.036427 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:38.036385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:38.036604 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:38.036528 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:38.036604 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:38.036590 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs podName:7ac4e3b6-523e-4f41-98be-ceb879813ac3 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:46.036572441 +0000 UTC m=+17.276925682 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs") pod "network-metrics-daemon-7dwzn" (UID: "7ac4e3b6-523e-4f41-98be-ceb879813ac3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:38.136840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:38.136806 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:38.137004 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:38.136975 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:38.137004 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:38.136995 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:38.137114 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:38.137010 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fhtx4 for pod openshift-network-diagnostics/network-check-target-bg9mt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:38.137114 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:38.137070 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4 podName:562ff80c-46f9-46ea-bfc9-cacccd0662db nodeName:}" failed. No retries permitted until 2026-04-24 21:26:46.137050929 +0000 UTC m=+17.377404170 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fhtx4" (UniqueName: "kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4") pod "network-check-target-bg9mt" (UID: "562ff80c-46f9-46ea-bfc9-cacccd0662db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:38.328158 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:38.328072 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:38.328311 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:38.328200 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:39.328310 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:39.328282 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:39.328768 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:39.328379 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:40.327645 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:40.327614 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:40.327847 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:40.327738 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:41.328097 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:41.328054 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:41.328483 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:41.328183 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:42.328418 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:42.328382 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:42.328881 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:42.328490 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:43.327544 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:43.327496 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:43.327759 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:43.327638 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:44.327900 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:44.327820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:44.328312 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:44.327935 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:45.327917 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:45.327877 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:45.328380 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:45.328036 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:46.091796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:46.091752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:46.092021 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:46.091910 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:46.092021 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:46.091986 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs podName:7ac4e3b6-523e-4f41-98be-ceb879813ac3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:02.09197046 +0000 UTC m=+33.332323698 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs") pod "network-metrics-daemon-7dwzn" (UID: "7ac4e3b6-523e-4f41-98be-ceb879813ac3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:46.192160 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:46.192104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:46.192344 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:46.192277 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:46.192344 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:46.192293 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:46.192344 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:46.192303 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fhtx4 for pod openshift-network-diagnostics/network-check-target-bg9mt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:46.192494 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:46.192346 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4 podName:562ff80c-46f9-46ea-bfc9-cacccd0662db nodeName:}" failed. No retries permitted until 2026-04-24 21:27:02.192334241 +0000 UTC m=+33.432687479 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fhtx4" (UniqueName: "kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4") pod "network-check-target-bg9mt" (UID: "562ff80c-46f9-46ea-bfc9-cacccd0662db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:46.327459 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:46.327419 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:46.327623 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:46.327539 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:47.327836 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:47.327797 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:47.328244 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:47.327922 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:48.327835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:48.327814 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:48.327948 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:48.327930 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:49.329049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.328658 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:49.329803 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:49.329067 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:49.453994 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.453966 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log"
Apr 24 21:26:49.454271 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.454248 2569 generic.go:358] "Generic (PLEG): container finished" podID="f94d99f1-b1b1-4885-b23c-789c312e3426" containerID="bf07968e02e17d3d483d56c59eb4173debd99c9cc2a44f168fb874a1af36be59" exitCode=1
Apr 24 21:26:49.454342 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.454300 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"eb82a644bcb786ee489c317439f0ce53e3ed495cc84911d7444280b921c5c268"}
Apr 24 21:26:49.454342 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.454327 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"704ae6341f23a55858dc6b705cf3458dd9594308123b5582fb82987272dfd3eb"}
Apr 24 21:26:49.454342 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.454336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"db676362915eac59226943b45645719d99536db35090889bfce9d39014b04fb8"}
Apr 24 21:26:49.454490 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.454345 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"b3ad033fd4d6a212d069e19a5f79bfcadae2e7d7f7e523470ad166b30c8b3e6a"}
Apr 24 21:26:49.454490 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.454353 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerDied","Data":"bf07968e02e17d3d483d56c59eb4173debd99c9cc2a44f168fb874a1af36be59"}
Apr 24 21:26:49.454490 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.454363 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"efba4d7b0e461f21ebd6def078635693bf8ccd4b09d3d11c5ccff428be8e5363"}
Apr 24 21:26:49.455566 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.455545 2569 generic.go:358] "Generic (PLEG): container finished" podID="a34653fe-8931-4a96-adb4-518f9c93a246" containerID="9ae62e0faaa8ee1de4db76162625df8850db21fc1b25bc6ca1e920726d958494" exitCode=0
Apr 24 21:26:49.455695 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.455605 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerDied","Data":"9ae62e0faaa8ee1de4db76162625df8850db21fc1b25bc6ca1e920726d958494"}
Apr 24 21:26:49.456863 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.456838 2569 generic.go:358] "Generic (PLEG): container finished" podID="48fe24b0efea27e3f5eca2d71913b3e7" containerID="27fa7d3f3c639d11ee921af959812162c934946f19f88d26bb0c0475153071d6" exitCode=0
Apr 24 21:26:49.456955 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.456866 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" event={"ID":"48fe24b0efea27e3f5eca2d71913b3e7","Type":"ContainerDied","Data":"27fa7d3f3c639d11ee921af959812162c934946f19f88d26bb0c0475153071d6"}
Apr 24 21:26:49.460512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.460485 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9pchk" event={"ID":"907a584d-065c-4c42-a7ff-3db1f2519bf9","Type":"ContainerStarted","Data":"34683213854de83a45db5e08050d6d0dce071a9ec6257016edad654e9cc23ca3"}
Apr 24 21:26:49.461711 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.461687 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csnv2" event={"ID":"d47b5659-b736-4e8d-abe4-3cee234ead85","Type":"ContainerStarted","Data":"8ea569c0feed111df1a5f5c8517d24f67dbcb1f378be24e13696069f0d0ebc3c"}
Apr 24 21:26:49.462797 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.462736 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kg22p" event={"ID":"376860d0-150d-43d2-ba76-6f4bd2a03019","Type":"ContainerStarted","Data":"45df90e347e690879693f844eeaebdafad81532a18419472de295fbb447b6a9f"}
Apr 24 21:26:49.463962 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.463939 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" event={"ID":"d74e5b7f3ca859862ed6413284694748","Type":"ContainerStarted","Data":"ce5ad2bd894a4a0c9f05acb45abcc37e3e667dd2e398937ca945c7348ff08313"}
Apr 24 21:26:49.465206 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.465184 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" event={"ID":"a63cbdf9-50e4-4a41-84c1-7803058b65cb","Type":"ContainerStarted","Data":"1665c1f8e648a37b2cc592f20d4c51ab04ee5f7e717db6b6d641037b9e221234"}
Apr 24 21:26:49.466246 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.466229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6bv5v" event={"ID":"eeb650ea-cab6-4757-8b65-a0b656f23baf","Type":"ContainerStarted","Data":"2a7f75f32c02683bdbd0a547bf7d235e5cd061b0b623365b86a2cf40e7029c89"}
Apr 24 21:26:49.500493 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.500428 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-73.ec2.internal" podStartSLOduration=19.500408795 podStartE2EDuration="19.500408795s" podCreationTimestamp="2026-04-24 21:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:49.50023178 +0000 UTC m=+20.740585039" watchObservedRunningTime="2026-04-24 21:26:49.500408795 +0000 UTC m=+20.740762058"
Apr 24 21:26:49.520523 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.520449 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kg22p" podStartSLOduration=2.834126708 podStartE2EDuration="20.520429818s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.604877402 +0000 UTC m=+1.845230640" lastFinishedPulling="2026-04-24 21:26:48.291180511 +0000 UTC m=+19.531533750" observedRunningTime="2026-04-24 21:26:49.520080271 +0000 UTC m=+20.760433529" watchObservedRunningTime="2026-04-24 21:26:49.520429818 +0000 UTC m=+20.760783079"
Apr 24 21:26:49.545387 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.545341 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-csnv2" podStartSLOduration=2.569203201 podStartE2EDuration="20.545327688s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.620441119 +0000 UTC m=+1.860794371" lastFinishedPulling="2026-04-24 21:26:48.596565618 +0000 UTC m=+19.836918858" observedRunningTime="2026-04-24 21:26:49.545025611 +0000 UTC m=+20.785378872" watchObservedRunningTime="2026-04-24 21:26:49.545327688 +0000 UTC m=+20.785680948"
Apr 24 21:26:49.578625 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.578578 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6bv5v" podStartSLOduration=6.72165903 podStartE2EDuration="20.578562213s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.63361031 +0000 UTC m=+1.873963562" lastFinishedPulling="2026-04-24 21:26:44.490513496 +0000 UTC m=+15.730866745" observedRunningTime="2026-04-24 21:26:49.562268788 +0000 UTC m=+20.802622047" watchObservedRunningTime="2026-04-24 21:26:49.578562213 +0000 UTC m=+20.818915454"
Apr 24 21:26:49.599508 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:49.599402 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9pchk" podStartSLOduration=3.011658647 podStartE2EDuration="20.599386833s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.643858074 +0000 UTC m=+1.884211312" lastFinishedPulling="2026-04-24 21:26:48.231586243 +0000 UTC m=+19.471939498" observedRunningTime="2026-04-24 21:26:49.598919328 +0000 UTC m=+20.839272586" watchObservedRunningTime="2026-04-24 21:26:49.599386833 +0000 UTC m=+20.839740093"
Apr 24 21:26:50.093660 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.093638 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:26:50.328069 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.328037 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:50.328247 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:50.328149 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:26:50.330156 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.330073 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:26:50.093656068Z","UUID":"b1f240a9-c3fc-46bc-ad83-2d51b290781f","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:26:50.334113 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.334090 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:26:50.334238 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.334122 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:26:50.469941 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.469903 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" event={"ID":"a63cbdf9-50e4-4a41-84c1-7803058b65cb","Type":"ContainerStarted","Data":"3899caec22553da50ce8cd55e9b8d241e286ece63333579be8b84b2df5824150"}
Apr 24 21:26:50.475471 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.471900 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" event={"ID":"48fe24b0efea27e3f5eca2d71913b3e7","Type":"ContainerStarted","Data":"ddb45eb81b6ac1ea015255ca5b845aed181702596d8fd30fcbf53d95b47c718f"}
Apr 24 21:26:50.475624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.475496 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-g5hcg" event={"ID":"2bd960cb-d43f-43af-84ee-6e0693fdb2da","Type":"ContainerStarted","Data":"1e805d45c028f8f0bae5843cb097ba9aa9eb6e2035ca73f10bf27f9bf4dd7d9e"}
Apr 24 21:26:50.491590 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.491543 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-73.ec2.internal" podStartSLOduration=20.491530504 podStartE2EDuration="20.491530504s" podCreationTimestamp="2026-04-24 21:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:50.491472161 +0000 UTC m=+21.731825423" watchObservedRunningTime="2026-04-24 21:26:50.491530504 +0000 UTC m=+21.731883764"
Apr 24 21:26:50.506855 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:50.506794 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-g5hcg" podStartSLOduration=3.816105072 podStartE2EDuration="21.506776294s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.598972742 +0000 UTC m=+1.839325979" lastFinishedPulling="2026-04-24 21:26:48.289643957 +0000 UTC m=+19.529997201" observedRunningTime="2026-04-24 21:26:50.506457126 +0000 UTC m=+21.746810410" watchObservedRunningTime="2026-04-24 21:26:50.506776294 +0000 UTC m=+21.747129553"
Apr 24 21:26:51.331944 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:51.331920 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:51.332367 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:51.332035 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:51.478094 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:51.478051 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" event={"ID":"a63cbdf9-50e4-4a41-84c1-7803058b65cb","Type":"ContainerStarted","Data":"463c374cdbed78c04b584faa3b168cdadc7db9699f5fb87e8219e28b9fe3fad2"}
Apr 24 21:26:51.481574 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:51.481552 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log"
Apr 24 21:26:51.482048 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:51.482021 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"c19356219a1160d6866461c29f6d289ddf07c8147c2eda5f12bd75757a65fc4b"}
Apr 24 21:26:51.503611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:51.503555 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-76tm7" podStartSLOduration=2.239175751 podStartE2EDuration="22.503537212s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.640784396 +0000 UTC m=+1.881137634" lastFinishedPulling="2026-04-24 21:26:50.905145849 +0000 UTC m=+22.145499095" observedRunningTime="2026-04-24 21:26:51.503380407 +0000 UTC m=+22.743733692" watchObservedRunningTime="2026-04-24 21:26:51.503537212 +0000 UTC m=+22.743890471"
Apr 24 21:26:52.328253 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:52.328078 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:26:52.328442 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:52.328339 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db" Apr 24 21:26:53.328393 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:53.328328 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:53.328852 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:53.328482 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3" Apr 24 21:26:53.936900 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:53.936862 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:53.937441 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:53.937422 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:54.072405 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.072193 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hkcmr"] Apr 24 21:26:54.075678 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.075652 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.078190 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.078169 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8bhxp\"" Apr 24 21:26:54.078408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.078393 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:26:54.078659 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.078647 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:26:54.144051 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.144013 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-hosts-file\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.144188 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.144063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-tmp-dir\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.144188 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.144156 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpf6p\" (UniqueName: \"kubernetes.io/projected/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-kube-api-access-tpf6p\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.244407 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.244378 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpf6p\" (UniqueName: \"kubernetes.io/projected/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-kube-api-access-tpf6p\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.244524 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.244410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-hosts-file\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.244524 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.244453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-tmp-dir\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.244616 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.244523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-hosts-file\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.244747 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.244730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-tmp-dir\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.253348 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.253317 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tpf6p\" (UniqueName: \"kubernetes.io/projected/3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8-kube-api-access-tpf6p\") pod \"node-resolver-hkcmr\" (UID: \"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8\") " pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.328029 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.327933 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:26:54.328170 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:54.328044 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db" Apr 24 21:26:54.383412 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.383369 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hkcmr" Apr 24 21:26:54.390212 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:26:54.390186 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a5a0d8d_bcc0_4444_bbd8_a88bae8670d8.slice/crio-858c710b72354b31be2c7f47e36cb6a27a241701dd72b7591f1587379958ebe2 WatchSource:0}: Error finding container 858c710b72354b31be2c7f47e36cb6a27a241701dd72b7591f1587379958ebe2: Status 404 returned error can't find the container with id 858c710b72354b31be2c7f47e36cb6a27a241701dd72b7591f1587379958ebe2 Apr 24 21:26:54.491766 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.491741 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:26:54.492102 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.492076 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"bb034cbe1609da4a79d2b99ce5a7b3e0a4f04b5b30df6d805b1b83af2d319e86"} Apr 24 21:26:54.492423 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.492407 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:54.492582 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.492567 2569 scope.go:117] "RemoveContainer" containerID="bf07968e02e17d3d483d56c59eb4173debd99c9cc2a44f168fb874a1af36be59" Apr 24 21:26:54.494098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.494074 2569 generic.go:358] "Generic (PLEG): container finished" podID="a34653fe-8931-4a96-adb4-518f9c93a246" containerID="93b1631b1eb5717fa4b5fcd4f2cb2428e07a74d50d51b04a79186cc763296482" exitCode=0 Apr 24 21:26:54.494213 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.494152 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerDied","Data":"93b1631b1eb5717fa4b5fcd4f2cb2428e07a74d50d51b04a79186cc763296482"} Apr 24 21:26:54.495865 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.495845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hkcmr" event={"ID":"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8","Type":"ContainerStarted","Data":"858c710b72354b31be2c7f47e36cb6a27a241701dd72b7591f1587379958ebe2"} Apr 24 21:26:54.496128 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.496113 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:54.496788 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.496768 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6bv5v" Apr 24 21:26:54.508696 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:54.508650 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:55.328093 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.328057 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:55.328253 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:55.328216 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3" Apr 24 21:26:55.499942 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.499912 2569 generic.go:358] "Generic (PLEG): container finished" podID="a34653fe-8931-4a96-adb4-518f9c93a246" containerID="008b9d687ca69a80d6401ff98d3dee86004e83b4a30ac40de767ab8d0fde78ad" exitCode=0 Apr 24 21:26:55.500357 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.499977 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerDied","Data":"008b9d687ca69a80d6401ff98d3dee86004e83b4a30ac40de767ab8d0fde78ad"} Apr 24 21:26:55.501499 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.501473 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hkcmr" event={"ID":"3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8","Type":"ContainerStarted","Data":"82bd66844a7f71eac9783105e5e2ede6d9eaab83deb2d003992c8c5bd224e923"} Apr 24 21:26:55.504785 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.504769 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:26:55.505162 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.505140 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" event={"ID":"f94d99f1-b1b1-4885-b23c-789c312e3426","Type":"ContainerStarted","Data":"c709b9fda372ddf6e839f3f2f6df875da84c4961fc314f7f6b7206212d386a1f"} Apr 24 21:26:55.505334 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.505315 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:55.505422 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.505346 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:55.520891 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.520867 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" Apr 24 21:26:55.592410 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.592352 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f" podStartSLOduration=8.925142933 podStartE2EDuration="26.592338067s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.628732052 +0000 UTC m=+1.869085304" lastFinishedPulling="2026-04-24 21:26:48.295927199 +0000 UTC m=+19.536280438" observedRunningTime="2026-04-24 21:26:55.564952226 +0000 UTC m=+26.805305486" watchObservedRunningTime="2026-04-24 21:26:55.592338067 +0000 UTC m=+26.832691327" Apr 24 21:26:55.592631 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.592561 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hkcmr" podStartSLOduration=1.5925525230000002 podStartE2EDuration="1.592552523s" podCreationTimestamp="2026-04-24 21:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:55.591760124 +0000 UTC m=+26.832113383" watchObservedRunningTime="2026-04-24 21:26:55.592552523 +0000 UTC m=+26.832905794" Apr 24 21:26:55.632461 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.632432 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7dwzn"] Apr 24 21:26:55.632592 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.632536 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:55.632688 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:55.632643 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3" Apr 24 21:26:55.635862 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.635840 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bg9mt"] Apr 24 21:26:55.635973 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:55.635933 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:26:55.636030 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:55.636001 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db" Apr 24 21:26:56.509347 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:56.509313 2569 generic.go:358] "Generic (PLEG): container finished" podID="a34653fe-8931-4a96-adb4-518f9c93a246" containerID="d1ee7e0ff552461b4418d9df38730c9e8bf6d9b6b7b9ff2a67b57596f637323e" exitCode=0 Apr 24 21:26:56.509858 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:56.509438 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerDied","Data":"d1ee7e0ff552461b4418d9df38730c9e8bf6d9b6b7b9ff2a67b57596f637323e"} Apr 24 21:26:57.023271 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.023238 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9zcwj"] Apr 24 21:26:57.026057 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.026028 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.026171 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:57.026121 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9zcwj" podUID="8179f9fe-0d63-49d2-84df-a9763b98a8c6" Apr 24 21:26:57.042821 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.042790 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9zcwj"] Apr 24 21:26:57.064423 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.064391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.064423 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.064427 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8179f9fe-0d63-49d2-84df-a9763b98a8c6-dbus\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.064603 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.064456 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8179f9fe-0d63-49d2-84df-a9763b98a8c6-kubelet-config\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.165787 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.165748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" 
Apr 24 21:26:57.165952 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.165796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8179f9fe-0d63-49d2-84df-a9763b98a8c6-dbus\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.165952 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.165829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8179f9fe-0d63-49d2-84df-a9763b98a8c6-kubelet-config\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.165952 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.165936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8179f9fe-0d63-49d2-84df-a9763b98a8c6-kubelet-config\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.165952 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:57.165938 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:57.166188 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:57.166016 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret podName:8179f9fe-0d63-49d2-84df-a9763b98a8c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:57.665996657 +0000 UTC m=+28.906349898 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret") pod "global-pull-secret-syncer-9zcwj" (UID: "8179f9fe-0d63-49d2-84df-a9763b98a8c6") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:57.166188 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.166173 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8179f9fe-0d63-49d2-84df-a9763b98a8c6-dbus\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.330864 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.330786 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn" Apr 24 21:26:57.330864 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.330815 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:26:57.331069 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:57.330913 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3" Apr 24 21:26:57.331069 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:57.331045 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db" Apr 24 21:26:57.511233 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.511204 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.511583 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:57.511334 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9zcwj" podUID="8179f9fe-0d63-49d2-84df-a9763b98a8c6" Apr 24 21:26:57.669533 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:57.669443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:26:57.669694 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:57.669600 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:57.669751 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:57.669697 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret podName:8179f9fe-0d63-49d2-84df-a9763b98a8c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:58.669659252 +0000 UTC m=+29.910012503 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret") pod "global-pull-secret-syncer-9zcwj" (UID: "8179f9fe-0d63-49d2-84df-a9763b98a8c6") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:58.677066 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:58.677025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj"
Apr 24 21:26:58.677663 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:58.677198 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:58.677663 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:58.677306 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret podName:8179f9fe-0d63-49d2-84df-a9763b98a8c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.677286062 +0000 UTC m=+31.917639316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret") pod "global-pull-secret-syncer-9zcwj" (UID: "8179f9fe-0d63-49d2-84df-a9763b98a8c6") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:59.328629 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:59.328588 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9zcwj"
Apr 24 21:26:59.328832 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:59.328716 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9zcwj" podUID="8179f9fe-0d63-49d2-84df-a9763b98a8c6"
Apr 24 21:26:59.328832 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:59.328764 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:26:59.328959 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:59.328874 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7dwzn" podUID="7ac4e3b6-523e-4f41-98be-ceb879813ac3"
Apr 24 21:26:59.328959 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:26:59.328893 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:26:59.329052 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:26:59.328991 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bg9mt" podUID="562ff80c-46f9-46ea-bfc9-cacccd0662db"
Apr 24 21:27:00.120571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.120543 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-73.ec2.internal" event="NodeReady"
Apr 24 21:27:00.121009 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.120710 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:27:00.171048 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.171014 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-747cff9957-t7znk"]
Apr 24 21:27:00.192603 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.192571 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h6bcf"]
Apr 24 21:27:00.192783 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.192761 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.196365 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.195654 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:27:00.196365 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.195936 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:27:00.196365 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.196191 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:27:00.196611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.196541 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qbk7c\""
Apr 24 21:27:00.203612 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.203586 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:27:00.222457 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.222431 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4ph6t"]
Apr 24 21:27:00.222621 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.222604 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.226877 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.226855 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 21:27:00.226991 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.226919 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfvbj\""
Apr 24 21:27:00.228034 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.228014 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 21:27:00.245291 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.245193 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-747cff9957-t7znk"]
Apr 24 21:27:00.245291 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.245227 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h6bcf"]
Apr 24 21:27:00.245291 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.245240 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4ph6t"]
Apr 24 21:27:00.245509 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.245352 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:00.249083 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.249060 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 21:27:00.249083 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.249060 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 21:27:00.249260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.249195 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fgjf8\""
Apr 24 21:27:00.249375 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.249354 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 21:27:00.288233 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288192 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lxvn\" (UniqueName: \"kubernetes.io/projected/867d54ac-7685-453e-ab98-671a28e06ea0-kube-api-access-6lxvn\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:00.288400 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288262 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-ca-trust-extracted\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.288400 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:00.288400 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-trusted-ca\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.288400 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288340 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.288585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288407 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-installation-pull-secrets\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.288585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288494 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-image-registry-private-configuration\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.288585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288546 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.288585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288569 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-certificates\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.288755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-bound-sa-token\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.288755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288628 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7tw\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-kube-api-access-5m7tw\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.288755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74d17f36-1527-485c-ad29-abd7e0f42a70-tmp-dir\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.288755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288685 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6lqq\" (UniqueName: \"kubernetes.io/projected/74d17f36-1527-485c-ad29-abd7e0f42a70-kube-api-access-t6lqq\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.288755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.288719 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74d17f36-1527-485c-ad29-abd7e0f42a70-config-volume\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.389684 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.389634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-image-registry-private-configuration\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.389848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.389718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.389848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.389744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-certificates\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.389848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.389765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-bound-sa-token\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.389848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.389780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m7tw\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-kube-api-access-5m7tw\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.389848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.389803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74d17f36-1527-485c-ad29-abd7e0f42a70-tmp-dir\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.390096 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.389896 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:00.390096 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.389919 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747cff9957-t7znk: secret "image-registry-tls" not found
Apr 24 21:27:00.390096 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.389978 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls podName:20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.889956594 +0000 UTC m=+32.130309834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls") pod "image-registry-747cff9957-t7znk" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d") : secret "image-registry-tls" not found
Apr 24 21:27:00.390096 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6lqq\" (UniqueName: \"kubernetes.io/projected/74d17f36-1527-485c-ad29-abd7e0f42a70-kube-api-access-t6lqq\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.390096 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390064 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74d17f36-1527-485c-ad29-abd7e0f42a70-config-volume\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.390315 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lxvn\" (UniqueName: \"kubernetes.io/projected/867d54ac-7685-453e-ab98-671a28e06ea0-kube-api-access-6lxvn\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:00.390315 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-ca-trust-extracted\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.390315 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390194 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:00.390315 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-trusted-ca\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.390315 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.390315 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390114 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74d17f36-1527-485c-ad29-abd7e0f42a70-tmp-dir\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.390315 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-installation-pull-secrets\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.390614 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.390440 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:00.390614 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.390491 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert podName:867d54ac-7685-453e-ab98-671a28e06ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.890474485 +0000 UTC m=+32.130827725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert") pod "ingress-canary-4ph6t" (UID: "867d54ac-7685-453e-ab98-671a28e06ea0") : secret "canary-serving-cert" not found
Apr 24 21:27:00.390614 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.390553 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:00.390614 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.390586 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls podName:74d17f36-1527-485c-ad29-abd7e0f42a70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.890575507 +0000 UTC m=+32.130928745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls") pod "dns-default-h6bcf" (UID: "74d17f36-1527-485c-ad29-abd7e0f42a70") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:00.390768 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.390627 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74d17f36-1527-485c-ad29-abd7e0f42a70-config-volume\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.394211 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.394183 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-installation-pull-secrets\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.394211 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.394207 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-image-registry-private-configuration\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.400348 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.400320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-ca-trust-extracted\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.400472 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.400435 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-certificates\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.400819 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.400799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-trusted-ca\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.404611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.404583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6lqq\" (UniqueName: \"kubernetes.io/projected/74d17f36-1527-485c-ad29-abd7e0f42a70-kube-api-access-t6lqq\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.404712 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.404655 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7tw\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-kube-api-access-5m7tw\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.404927 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.404908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-bound-sa-token\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.414485 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.414462 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lxvn\" (UniqueName: \"kubernetes.io/projected/867d54ac-7685-453e-ab98-671a28e06ea0-kube-api-access-6lxvn\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:00.692844 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.692803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj"
Apr 24 21:27:00.693039 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.692938 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:00.693039 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.693033 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret podName:8179f9fe-0d63-49d2-84df-a9763b98a8c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:04.693003527 +0000 UTC m=+35.933356792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret") pod "global-pull-secret-syncer-9zcwj" (UID: "8179f9fe-0d63-49d2-84df-a9763b98a8c6") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:00.894597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.894561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:00.894785 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.894608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:00.894785 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:00.894664 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:00.894785 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.894758 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:00.894785 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.894773 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:00.894785 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.894783 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:00.895071 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.894797 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747cff9957-t7znk: secret "image-registry-tls" not found
Apr 24 21:27:00.895071 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.894826 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert podName:867d54ac-7685-453e-ab98-671a28e06ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.894810812 +0000 UTC m=+33.135164050 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert") pod "ingress-canary-4ph6t" (UID: "867d54ac-7685-453e-ab98-671a28e06ea0") : secret "canary-serving-cert" not found
Apr 24 21:27:00.895071 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.894847 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls podName:74d17f36-1527-485c-ad29-abd7e0f42a70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.894838309 +0000 UTC m=+33.135191549 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls") pod "dns-default-h6bcf" (UID: "74d17f36-1527-485c-ad29-abd7e0f42a70") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:00.895071 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:00.894862 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls podName:20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.894853801 +0000 UTC m=+33.135207053 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls") pod "image-registry-747cff9957-t7znk" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d") : secret "image-registry-tls" not found
Apr 24 21:27:01.328027 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.327954 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt"
Apr 24 21:27:01.328027 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.327985 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9zcwj"
Apr 24 21:27:01.328027 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.328001 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:27:01.332561 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.332535 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:27:01.332714 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.332578 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:27:01.332714 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.332608 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:27:01.332714 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.332535 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vd7zr\""
Apr 24 21:27:01.332714 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.332705 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:27:01.332922 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.332830 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s9lwz\""
Apr 24 21:27:01.902565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.902531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:01.902775 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.902580 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:01.903017 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:01.902993 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:01.903017 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:01.903009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:01.903131 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:01.903087 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert podName:867d54ac-7685-453e-ab98-671a28e06ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.903054455 +0000 UTC m=+35.143407696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert") pod "ingress-canary-4ph6t" (UID: "867d54ac-7685-453e-ab98-671a28e06ea0") : secret "canary-serving-cert" not found
Apr 24 21:27:01.903131 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:01.903127 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:01.903226 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:01.903138 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747cff9957-t7znk: secret "image-registry-tls" not found
Apr 24 21:27:01.903226 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:01.903179 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls podName:20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.903162983 +0000 UTC m=+35.143516221 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls") pod "image-registry-747cff9957-t7znk" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d") : secret "image-registry-tls" not found
Apr 24 21:27:01.903226 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:01.903221 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:01.903360 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:01.903247 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls podName:74d17f36-1527-485c-ad29-abd7e0f42a70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.903236802 +0000 UTC m=+35.143590040 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls") pod "dns-default-h6bcf" (UID: "74d17f36-1527-485c-ad29-abd7e0f42a70") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:02.104230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:02.104186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:27:02.104424 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:02.104316 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:27:02.104424 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:02.104386 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs podName:7ac4e3b6-523e-4f41-98be-ceb879813ac3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:34.104366411 +0000 UTC m=+65.344719666 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs") pod "network-metrics-daemon-7dwzn" (UID: "7ac4e3b6-523e-4f41-98be-ceb879813ac3") : secret "metrics-daemon-secret" not found Apr 24 21:27:02.204936 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:02.204848 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:27:02.207564 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:02.207539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhtx4\" (UniqueName: \"kubernetes.io/projected/562ff80c-46f9-46ea-bfc9-cacccd0662db-kube-api-access-fhtx4\") pod \"network-check-target-bg9mt\" (UID: \"562ff80c-46f9-46ea-bfc9-cacccd0662db\") " pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:27:02.240358 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:02.240326 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:27:02.476001 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:02.475974 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bg9mt"] Apr 24 21:27:02.479561 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:02.479534 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod562ff80c_46f9_46ea_bfc9_cacccd0662db.slice/crio-7ea04038e1600042e231f472e07d4a6c0a2f87f6c7c6dc5e3c197af628d0e17b WatchSource:0}: Error finding container 7ea04038e1600042e231f472e07d4a6c0a2f87f6c7c6dc5e3c197af628d0e17b: Status 404 returned error can't find the container with id 7ea04038e1600042e231f472e07d4a6c0a2f87f6c7c6dc5e3c197af628d0e17b Apr 24 21:27:02.522081 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:02.522034 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerStarted","Data":"03f414bcd893a3626fec033dde581e4fccceb06712d5a0dc83ad5c7b6ba35e8b"} Apr 24 21:27:02.523172 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:02.523133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bg9mt" event={"ID":"562ff80c-46f9-46ea-bfc9-cacccd0662db","Type":"ContainerStarted","Data":"7ea04038e1600042e231f472e07d4a6c0a2f87f6c7c6dc5e3c197af628d0e17b"} Apr 24 21:27:03.527464 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:03.527429 2569 generic.go:358] "Generic (PLEG): container finished" podID="a34653fe-8931-4a96-adb4-518f9c93a246" containerID="03f414bcd893a3626fec033dde581e4fccceb06712d5a0dc83ad5c7b6ba35e8b" exitCode=0 Apr 24 21:27:03.527976 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:03.527479 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerDied","Data":"03f414bcd893a3626fec033dde581e4fccceb06712d5a0dc83ad5c7b6ba35e8b"} Apr 24 21:27:03.920776 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:03.920693 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t" Apr 24 21:27:03.920776 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:03.920744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf" Apr 24 21:27:03.920776 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:03.920772 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk" Apr 24 21:27:03.921070 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:03.920859 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:03.921070 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:03.920891 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:03.921070 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:03.920904 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-747cff9957-t7znk: secret "image-registry-tls" not found Apr 24 21:27:03.921070 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:03.920915 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:03.921070 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:03.920938 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert podName:867d54ac-7685-453e-ab98-671a28e06ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.92091343 +0000 UTC m=+39.161266669 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert") pod "ingress-canary-4ph6t" (UID: "867d54ac-7685-453e-ab98-671a28e06ea0") : secret "canary-serving-cert" not found Apr 24 21:27:03.921070 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:03.920968 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls podName:20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.920958284 +0000 UTC m=+39.161311526 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls") pod "image-registry-747cff9957-t7znk" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d") : secret "image-registry-tls" not found Apr 24 21:27:03.921070 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:03.921037 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls podName:74d17f36-1527-485c-ad29-abd7e0f42a70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.921015748 +0000 UTC m=+39.161368989 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls") pod "dns-default-h6bcf" (UID: "74d17f36-1527-485c-ad29-abd7e0f42a70") : secret "dns-default-metrics-tls" not found Apr 24 21:27:04.532755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:04.532719 2569 generic.go:358] "Generic (PLEG): container finished" podID="a34653fe-8931-4a96-adb4-518f9c93a246" containerID="aa2a4e275c957d84569efe884df62885fdb2b524051342ae6e547a8573d32569" exitCode=0 Apr 24 21:27:04.533229 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:04.532790 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerDied","Data":"aa2a4e275c957d84569efe884df62885fdb2b524051342ae6e547a8573d32569"} Apr 24 21:27:04.726711 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:04.726652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:27:04.730251 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:04.730220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8179f9fe-0d63-49d2-84df-a9763b98a8c6-original-pull-secret\") pod \"global-pull-secret-syncer-9zcwj\" (UID: \"8179f9fe-0d63-49d2-84df-a9763b98a8c6\") " pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:27:04.949164 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:04.949079 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9zcwj" Apr 24 21:27:05.292714 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:05.292664 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9zcwj"] Apr 24 21:27:05.370277 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:05.370245 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8179f9fe_0d63_49d2_84df_a9763b98a8c6.slice/crio-31f5a73ad0e32fdc578957d9aa255fbcc78d62e7e6f17cc052ed163cf481fb6e WatchSource:0}: Error finding container 31f5a73ad0e32fdc578957d9aa255fbcc78d62e7e6f17cc052ed163cf481fb6e: Status 404 returned error can't find the container with id 31f5a73ad0e32fdc578957d9aa255fbcc78d62e7e6f17cc052ed163cf481fb6e Apr 24 21:27:05.537342 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:05.537191 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l64cf" event={"ID":"a34653fe-8931-4a96-adb4-518f9c93a246","Type":"ContainerStarted","Data":"32abc3303294c3151ccd9fad06b52c4c51187c59977230eea5870349a086c434"} Apr 24 21:27:05.538477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:05.538453 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bg9mt" event={"ID":"562ff80c-46f9-46ea-bfc9-cacccd0662db","Type":"ContainerStarted","Data":"735d536b8c1a74d642c87f324e540997fdd35a86bbf6e4ef0cdad34a95f19b21"} Apr 24 21:27:05.538579 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:05.538565 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:27:05.539368 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:05.539348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9zcwj" 
event={"ID":"8179f9fe-0d63-49d2-84df-a9763b98a8c6","Type":"ContainerStarted","Data":"31f5a73ad0e32fdc578957d9aa255fbcc78d62e7e6f17cc052ed163cf481fb6e"} Apr 24 21:27:05.565721 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:05.565663 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l64cf" podStartSLOduration=4.834330958 podStartE2EDuration="36.565651346s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:26:30.614843927 +0000 UTC m=+1.855197168" lastFinishedPulling="2026-04-24 21:27:02.346164317 +0000 UTC m=+33.586517556" observedRunningTime="2026-04-24 21:27:05.564159367 +0000 UTC m=+36.804512626" watchObservedRunningTime="2026-04-24 21:27:05.565651346 +0000 UTC m=+36.806004605" Apr 24 21:27:05.583109 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:05.583069 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bg9mt" podStartSLOduration=33.674768892 podStartE2EDuration="36.583060581s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:02.48159932 +0000 UTC m=+33.721952564" lastFinishedPulling="2026-04-24 21:27:05.389891012 +0000 UTC m=+36.630244253" observedRunningTime="2026-04-24 21:27:05.581727702 +0000 UTC m=+36.822080962" watchObservedRunningTime="2026-04-24 21:27:05.583060581 +0000 UTC m=+36.823413838" Apr 24 21:27:07.953503 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:07.953465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t" Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:07.953521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf" Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:07.953558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk" Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:07.953641 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:07.953659 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:07.953663 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:07.953701 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747cff9957-t7znk: secret "image-registry-tls" not found Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:07.953731 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert podName:867d54ac-7685-453e-ab98-671a28e06ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.953711477 +0000 UTC m=+47.194064716 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert") pod "ingress-canary-4ph6t" (UID: "867d54ac-7685-453e-ab98-671a28e06ea0") : secret "canary-serving-cert" not found Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:07.953747 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls podName:74d17f36-1527-485c-ad29-abd7e0f42a70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.953741088 +0000 UTC m=+47.194094327 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls") pod "dns-default-h6bcf" (UID: "74d17f36-1527-485c-ad29-abd7e0f42a70") : secret "dns-default-metrics-tls" not found Apr 24 21:27:07.954020 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:07.953759 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls podName:20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.953754031 +0000 UTC m=+47.194107269 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls") pod "image-registry-747cff9957-t7znk" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d") : secret "image-registry-tls" not found Apr 24 21:27:09.548896 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:09.548818 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9zcwj" event={"ID":"8179f9fe-0d63-49d2-84df-a9763b98a8c6","Type":"ContainerStarted","Data":"df037bc86c5fc532aea2b9a9323ac3049fe9f2076287782c166f16be9a4241c1"} Apr 24 21:27:09.572253 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:09.572199 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9zcwj" podStartSLOduration=8.658144583 podStartE2EDuration="12.572184398s" podCreationTimestamp="2026-04-24 21:26:57 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.37948954 +0000 UTC m=+36.619842778" lastFinishedPulling="2026-04-24 21:27:09.293529355 +0000 UTC m=+40.533882593" observedRunningTime="2026-04-24 21:27:09.571827681 +0000 UTC m=+40.812180946" watchObservedRunningTime="2026-04-24 21:27:09.572184398 +0000 UTC m=+40.812537658" Apr 24 21:27:13.038873 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.038839 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9"] Apr 24 21:27:13.061046 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.061017 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hkcmr_3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8/dns-node-resolver/0.log" Apr 24 21:27:13.068988 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.068964 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9"] Apr 24 21:27:13.069115 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.069077 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" Apr 24 21:27:13.072481 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.072459 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:13.072600 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.072460 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 21:27:13.074773 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.074750 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-hcgcv\"" Apr 24 21:27:13.188653 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.188617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhv94\" (UniqueName: \"kubernetes.io/projected/59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c-kube-api-access-rhv94\") pod \"migrator-74bb7799d9-kk8l9\" (UID: \"59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" Apr 24 21:27:13.289254 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.289166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhv94\" (UniqueName: \"kubernetes.io/projected/59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c-kube-api-access-rhv94\") pod \"migrator-74bb7799d9-kk8l9\" (UID: \"59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" Apr 24 21:27:13.298311 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.298284 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhv94\" (UniqueName: 
\"kubernetes.io/projected/59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c-kube-api-access-rhv94\") pod \"migrator-74bb7799d9-kk8l9\" (UID: \"59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" Apr 24 21:27:13.377356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.377323 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" Apr 24 21:27:13.498206 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.498175 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9"] Apr 24 21:27:13.500664 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:13.500635 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59df3ff1_21a2_4ad6_aea0_e7c4cd6aee2c.slice/crio-aa0ad7f2084936644e000fbf471c295305bfa85719462db32e80e4a5adcb7f7f WatchSource:0}: Error finding container aa0ad7f2084936644e000fbf471c295305bfa85719462db32e80e4a5adcb7f7f: Status 404 returned error can't find the container with id aa0ad7f2084936644e000fbf471c295305bfa85719462db32e80e4a5adcb7f7f Apr 24 21:27:13.555709 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:13.555611 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" event={"ID":"59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c","Type":"ContainerStarted","Data":"aa0ad7f2084936644e000fbf471c295305bfa85719462db32e80e4a5adcb7f7f"} Apr 24 21:27:14.049534 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.049512 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9pchk_907a584d-065c-4c42-a7ff-3db1f2519bf9/node-ca/0.log" Apr 24 21:27:14.388434 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.388346 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca/service-ca-865cb79987-hbpl5"] Apr 24 21:27:14.407737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.407703 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hbpl5"] Apr 24 21:27:14.407888 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.407807 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hbpl5" Apr 24 21:27:14.410602 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.410576 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-6sk64\"" Apr 24 21:27:14.410789 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.410770 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:27:14.412038 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.412011 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:27:14.412143 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.412063 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:27:14.412143 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.412113 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:27:14.499216 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.499177 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-signing-cabundle\") pod \"service-ca-865cb79987-hbpl5\" (UID: \"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5" Apr 24 21:27:14.499367 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:27:14.499222 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c87cn\" (UniqueName: \"kubernetes.io/projected/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-kube-api-access-c87cn\") pod \"service-ca-865cb79987-hbpl5\" (UID: \"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5" Apr 24 21:27:14.499367 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.499254 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-signing-key\") pod \"service-ca-865cb79987-hbpl5\" (UID: \"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5" Apr 24 21:27:14.599958 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.599918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-signing-cabundle\") pod \"service-ca-865cb79987-hbpl5\" (UID: \"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5" Apr 24 21:27:14.599958 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.599955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c87cn\" (UniqueName: \"kubernetes.io/projected/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-kube-api-access-c87cn\") pod \"service-ca-865cb79987-hbpl5\" (UID: \"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5" Apr 24 21:27:14.600167 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.599975 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-signing-key\") pod \"service-ca-865cb79987-hbpl5\" (UID: 
\"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5"
Apr 24 21:27:14.600736 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.600717    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-signing-cabundle\") pod \"service-ca-865cb79987-hbpl5\" (UID: \"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5"
Apr 24 21:27:14.602563 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.602540    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-signing-key\") pod \"service-ca-865cb79987-hbpl5\" (UID: \"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5"
Apr 24 21:27:14.608399 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.608373    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c87cn\" (UniqueName: \"kubernetes.io/projected/c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3-kube-api-access-c87cn\") pod \"service-ca-865cb79987-hbpl5\" (UID: \"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3\") " pod="openshift-service-ca/service-ca-865cb79987-hbpl5"
Apr 24 21:27:14.718519 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:14.718434    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-hbpl5"
Apr 24 21:27:15.018263 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:15.018234    2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-hbpl5"]
Apr 24 21:27:15.033890 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:15.033865    2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2fb1c7b_7bcf_46ba_acc2_3b3111ea8da3.slice/crio-ed6d9cdd0a2622e574bfc85c2c98d2ad12daa8220ac05e0a1b6e1b24b3cd8018 WatchSource:0}: Error finding container ed6d9cdd0a2622e574bfc85c2c98d2ad12daa8220ac05e0a1b6e1b24b3cd8018: Status 404 returned error can't find the container with id ed6d9cdd0a2622e574bfc85c2c98d2ad12daa8220ac05e0a1b6e1b24b3cd8018
Apr 24 21:27:15.561417 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:15.561382    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" event={"ID":"59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c","Type":"ContainerStarted","Data":"2fd5a3419c6a665113c6774b4e02a7a31e7eb5acfa374983ea65a869c535419b"}
Apr 24 21:27:15.561417 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:15.561427    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" event={"ID":"59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c","Type":"ContainerStarted","Data":"705cd69aaeb980c0783cffc59331543402382006a5ac4212516ebc1da1cad47a"}
Apr 24 21:27:15.562614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:15.562583    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hbpl5" event={"ID":"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3","Type":"ContainerStarted","Data":"ed6d9cdd0a2622e574bfc85c2c98d2ad12daa8220ac05e0a1b6e1b24b3cd8018"}
Apr 24 21:27:16.011643 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:16.011604    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:16.011837 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:16.011712    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:16.011837 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:16.011797    2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:16.011945 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:16.011829    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:16.011945 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:16.011797    2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:16.011945 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:16.011863    2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-747cff9957-t7znk: secret "image-registry-tls" not found
Apr 24 21:27:16.011945 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:16.011865    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert podName:867d54ac-7685-453e-ab98-671a28e06ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.011847299 +0000 UTC m=+63.252200556 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert") pod "ingress-canary-4ph6t" (UID: "867d54ac-7685-453e-ab98-671a28e06ea0") : secret "canary-serving-cert" not found
Apr 24 21:27:16.011945 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:16.011929    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls podName:20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.011913176 +0000 UTC m=+63.252266420 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls") pod "image-registry-747cff9957-t7znk" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d") : secret "image-registry-tls" not found
Apr 24 21:27:16.012154 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:16.011943    2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:16.012154 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:27:16.012003    2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls podName:74d17f36-1527-485c-ad29-abd7e0f42a70 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.011987479 +0000 UTC m=+63.252340731 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls") pod "dns-default-h6bcf" (UID: "74d17f36-1527-485c-ad29-abd7e0f42a70") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:17.567962 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:17.567930    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-hbpl5" event={"ID":"c2fb1c7b-7bcf-46ba-acc2-3b3111ea8da3","Type":"ContainerStarted","Data":"0709fe34b6e3601eb28536999ce8b07d71e2ad4851c41b23e5ab7e0fd0e08e56"}
Apr 24 21:27:17.588182 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:17.588136    2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-kk8l9" podStartSLOduration=2.841331747 podStartE2EDuration="4.58812284s" podCreationTimestamp="2026-04-24 21:27:13 +0000 UTC" firstStartedPulling="2026-04-24 21:27:13.502575621 +0000 UTC m=+44.742928859" lastFinishedPulling="2026-04-24 21:27:15.249366697 +0000 UTC m=+46.489719952" observedRunningTime="2026-04-24 21:27:15.589665022 +0000 UTC m=+46.830018283" watchObservedRunningTime="2026-04-24 21:27:17.58812284 +0000 UTC m=+48.828476172"
Apr 24 21:27:17.588363 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:17.588224    2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-hbpl5" podStartSLOduration=1.397732481 podStartE2EDuration="3.588218776s" podCreationTimestamp="2026-04-24 21:27:14 +0000 UTC" firstStartedPulling="2026-04-24 21:27:15.035635716 +0000 UTC m=+46.275988955" lastFinishedPulling="2026-04-24 21:27:17.226122009 +0000 UTC m=+48.466475250" observedRunningTime="2026-04-24 21:27:17.587451978 +0000 UTC m=+48.827805239" watchObservedRunningTime="2026-04-24 21:27:17.588218776 +0000 UTC m=+48.828572037"
Apr 24 21:27:27.522063 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:27.522036    2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jgb8f"
Apr 24 21:27:32.036078 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.036043    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:32.036525 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.036099    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:32.036525 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.036121    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:32.038486 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.038464    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74d17f36-1527-485c-ad29-abd7e0f42a70-metrics-tls\") pod \"dns-default-h6bcf\" (UID: \"74d17f36-1527-485c-ad29-abd7e0f42a70\") " pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:32.038787 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.038766    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"image-registry-747cff9957-t7znk\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:32.038835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.038768    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/867d54ac-7685-453e-ab98-671a28e06ea0-cert\") pod \"ingress-canary-4ph6t\" (UID: \"867d54ac-7685-453e-ab98-671a28e06ea0\") " pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:32.058262 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.058239    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fgjf8\""
Apr 24 21:27:32.065765 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.065749    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4ph6t"
Apr 24 21:27:32.179318 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.179292    2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4ph6t"]
Apr 24 21:27:32.181746 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:32.181711    2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod867d54ac_7685_453e_ab98_671a28e06ea0.slice/crio-1a2a894225190fdf256c04adc80ceb81dfea0ee7f092e69c81610ffd732f04ad WatchSource:0}: Error finding container 1a2a894225190fdf256c04adc80ceb81dfea0ee7f092e69c81610ffd732f04ad: Status 404 returned error can't find the container with id 1a2a894225190fdf256c04adc80ceb81dfea0ee7f092e69c81610ffd732f04ad
Apr 24 21:27:32.311015 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.310935    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qbk7c\""
Apr 24 21:27:32.319223 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.319200    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:32.334877 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.334857    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfvbj\""
Apr 24 21:27:32.342821 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.342795    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:32.440799 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.440767    2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-747cff9957-t7znk"]
Apr 24 21:27:32.450695 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:32.450643    2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c7bcc1_ce93_4ec1_9668_3ce24cd22b6d.slice/crio-f134489e1035118ab4f770860bb3522ae76ec031acea9609103e591abde82b44 WatchSource:0}: Error finding container f134489e1035118ab4f770860bb3522ae76ec031acea9609103e591abde82b44: Status 404 returned error can't find the container with id f134489e1035118ab4f770860bb3522ae76ec031acea9609103e591abde82b44
Apr 24 21:27:32.476181 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.476156    2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h6bcf"]
Apr 24 21:27:32.480598 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:32.480573    2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d17f36_1527_485c_ad29_abd7e0f42a70.slice/crio-1ffa121ceea45c458f349c1595fda3ee688cd2d5939bfbb4afc0b9fd4cf5abfe WatchSource:0}: Error finding container 1ffa121ceea45c458f349c1595fda3ee688cd2d5939bfbb4afc0b9fd4cf5abfe: Status 404 returned error can't find the container with id 1ffa121ceea45c458f349c1595fda3ee688cd2d5939bfbb4afc0b9fd4cf5abfe
Apr 24 21:27:32.603260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.603179    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747cff9957-t7znk" event={"ID":"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d","Type":"ContainerStarted","Data":"e1110a5ff699d230a5fa91a09a13e67e86b2a36459ef3aceef10b4a7b06b50b7"}
Apr 24 21:27:32.603260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.603227    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747cff9957-t7znk" event={"ID":"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d","Type":"ContainerStarted","Data":"f134489e1035118ab4f770860bb3522ae76ec031acea9609103e591abde82b44"}
Apr 24 21:27:32.603441 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.603293    2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:32.604471 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.604448    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h6bcf" event={"ID":"74d17f36-1527-485c-ad29-abd7e0f42a70","Type":"ContainerStarted","Data":"1ffa121ceea45c458f349c1595fda3ee688cd2d5939bfbb4afc0b9fd4cf5abfe"}
Apr 24 21:27:32.605650 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.605624    2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4ph6t" event={"ID":"867d54ac-7685-453e-ab98-671a28e06ea0","Type":"ContainerStarted","Data":"1a2a894225190fdf256c04adc80ceb81dfea0ee7f092e69c81610ffd732f04ad"}
Apr 24 21:27:32.622956 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:32.622912    2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-747cff9957-t7znk" podStartSLOduration=63.622899386 podStartE2EDuration="1m3.622899386s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:32.622321226 +0000 UTC m=+63.862674496" watchObservedRunningTime="2026-04-24 21:27:32.622899386 +0000 UTC m=+63.863252643"
Apr 24 21:27:34.153433 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.153398    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:27:34.156355 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.156325    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ac4e3b6-523e-4f41-98be-ceb879813ac3-metrics-certs\") pod \"network-metrics-daemon-7dwzn\" (UID: \"7ac4e3b6-523e-4f41-98be-ceb879813ac3\") " pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:27:34.356263 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.356231    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vd7zr\""
Apr 24 21:27:34.364450 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.364427    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7dwzn"
Apr 24 21:27:34.419242 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.419181    2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"]
Apr 24 21:27:34.434160 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.434137    2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"]
Apr 24 21:27:34.434306 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.434291    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.437487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.437176    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 21:27:34.437487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.437196    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 21:27:34.437774 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.437753    2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 21:27:34.438006 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.437987    2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 21:27:34.455722 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.455697    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-tmp\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.455818 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.455750    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-klusterlet-config\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.455873 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.455812    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rm2w\" (UniqueName: \"kubernetes.io/projected/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-kube-api-access-2rm2w\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.461450 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.461434    2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"]
Apr 24 21:27:34.461535 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.461520    2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"]
Apr 24 21:27:34.461582 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.461522    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"
Apr 24 21:27:34.464780 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.464760    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-2c4tc\""
Apr 24 21:27:34.465097 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.465078    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 21:27:34.465268 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.465254    2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-747cff9957-t7znk"]
Apr 24 21:27:34.531875 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.531851    2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-w28n7"]
Apr 24 21:27:34.549816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.549794    2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-w28n7"]
Apr 24 21:27:34.549951 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.549929    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.556228 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556203    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27xv\" (UniqueName: \"kubernetes.io/projected/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-kube-api-access-k27xv\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.556360 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556342    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-klusterlet-config\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.556429 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556382    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-crio-socket\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.556429 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556413    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.556518 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556456    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.556570 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556524    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rm2w\" (UniqueName: \"kubernetes.io/projected/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-kube-api-access-2rm2w\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.556618 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556574    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v74w\" (UniqueName: \"kubernetes.io/projected/c8f1d732-be21-403c-91d6-d45eb0f2afef-kube-api-access-9v74w\") pod \"managed-serviceaccount-addon-agent-778cb564df-w9lf7\" (UID: \"c8f1d732-be21-403c-91d6-d45eb0f2afef\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"
Apr 24 21:27:34.556618 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556600    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-data-volume\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.556732 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556629    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c8f1d732-be21-403c-91d6-d45eb0f2afef-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-778cb564df-w9lf7\" (UID: \"c8f1d732-be21-403c-91d6-d45eb0f2afef\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"
Apr 24 21:27:34.556732 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.556652    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-tmp\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.557088 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.557065    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-tmp\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.558894 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.558876    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-klusterlet-config\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.559258 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.559244    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-85klr\""
Apr 24 21:27:34.559525 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.559507    2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:27:34.559632 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.559523    2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:27:34.559791 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.559776    2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:27:34.564904 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.564887    2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:27:34.578338 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.578320    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rm2w\" (UniqueName: \"kubernetes.io/projected/9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca-kube-api-access-2rm2w\") pod \"klusterlet-addon-workmgr-98894b8fb-l458s\" (UID: \"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:34.614314 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.614290    2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7859dbcb8d-bfkg5"]
Apr 24 21:27:34.638180 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.638158    2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7859dbcb8d-bfkg5"]
Apr 24 21:27:34.638324 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.638272    2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657270 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657244    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.657388 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657285    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.657388 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657331    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12bfbd31-9b7c-48ae-b6fd-469c93decd67-ca-trust-extracted\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657388 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657361    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-registry-tls\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657547 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657412    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v74w\" (UniqueName: \"kubernetes.io/projected/c8f1d732-be21-403c-91d6-d45eb0f2afef-kube-api-access-9v74w\") pod \"managed-serviceaccount-addon-agent-778cb564df-w9lf7\" (UID: \"c8f1d732-be21-403c-91d6-d45eb0f2afef\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"
Apr 24 21:27:34.657547 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657492    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-data-volume\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.657638 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657556    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12bfbd31-9b7c-48ae-b6fd-469c93decd67-trusted-ca\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657638 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657594    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12bfbd31-9b7c-48ae-b6fd-469c93decd67-installation-pull-secrets\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657638 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657618    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknnr\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-kube-api-access-mknnr\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657662    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c8f1d732-be21-403c-91d6-d45eb0f2afef-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-778cb564df-w9lf7\" (UID: \"c8f1d732-be21-403c-91d6-d45eb0f2afef\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"
Apr 24 21:27:34.657840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657712    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12bfbd31-9b7c-48ae-b6fd-469c93decd67-image-registry-private-configuration\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657734    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-bound-sa-token\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657762    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k27xv\" (UniqueName: \"kubernetes.io/projected/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-kube-api-access-k27xv\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.657840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657784    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-data-volume\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.657840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657804    2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12bfbd31-9b7c-48ae-b6fd-469c93decd67-registry-certificates\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:34.657840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657831    2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-crio-socket\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.657840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657838    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.658169 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.657915    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-crio-socket\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.659968 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.659944    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") " pod="openshift-insights/insights-runtime-extractor-w28n7"
Apr 24 21:27:34.660056 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.660026    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c8f1d732-be21-403c-91d6-d45eb0f2afef-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-778cb564df-w9lf7\" (UID: \"c8f1d732-be21-403c-91d6-d45eb0f2afef\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"
Apr 24 21:27:34.679928 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.679864    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v74w\" (UniqueName: \"kubernetes.io/projected/c8f1d732-be21-403c-91d6-d45eb0f2afef-kube-api-access-9v74w\") pod \"managed-serviceaccount-addon-agent-778cb564df-w9lf7\" (UID: \"c8f1d732-be21-403c-91d6-d45eb0f2afef\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"
Apr 24 21:27:34.680241 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.680225    2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27xv\" (UniqueName: \"kubernetes.io/projected/2231f388-e1f5-4ecf-b03c-f5f6c48450c3-kube-api-access-k27xv\") pod \"insights-runtime-extractor-w28n7\" (UID: \"2231f388-e1f5-4ecf-b03c-f5f6c48450c3\") "
pod="openshift-insights/insights-runtime-extractor-w28n7" Apr 24 21:27:34.746422 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.746399 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s" Apr 24 21:27:34.758898 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.758869 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-registry-tls\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.758993 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.758910 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12bfbd31-9b7c-48ae-b6fd-469c93decd67-trusted-ca\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.758993 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.758943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12bfbd31-9b7c-48ae-b6fd-469c93decd67-installation-pull-secrets\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.759148 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.759125 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mknnr\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-kube-api-access-mknnr\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " 
pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.759215 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.759191 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12bfbd31-9b7c-48ae-b6fd-469c93decd67-image-registry-private-configuration\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.759270 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.759223 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-bound-sa-token\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.759319 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.759292 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12bfbd31-9b7c-48ae-b6fd-469c93decd67-registry-certificates\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.759876 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.759377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12bfbd31-9b7c-48ae-b6fd-469c93decd67-ca-trust-extracted\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.759876 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.759811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/12bfbd31-9b7c-48ae-b6fd-469c93decd67-ca-trust-extracted\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.760034 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.759906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12bfbd31-9b7c-48ae-b6fd-469c93decd67-trusted-ca\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.760432 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.760390 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/12bfbd31-9b7c-48ae-b6fd-469c93decd67-registry-certificates\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.761700 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.761654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/12bfbd31-9b7c-48ae-b6fd-469c93decd67-installation-pull-secrets\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.761875 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.761857 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-registry-tls\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.762249 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.762229 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/12bfbd31-9b7c-48ae-b6fd-469c93decd67-image-registry-private-configuration\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.776110 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.776085 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknnr\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-kube-api-access-mknnr\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.780911 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.780508 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12bfbd31-9b7c-48ae-b6fd-469c93decd67-bound-sa-token\") pod \"image-registry-7859dbcb8d-bfkg5\" (UID: \"12bfbd31-9b7c-48ae-b6fd-469c93decd67\") " pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.782637 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.782322 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7" Apr 24 21:27:34.858909 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.858539 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-w28n7" Apr 24 21:27:34.887438 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.887409 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7dwzn"] Apr 24 21:27:34.927527 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.927498 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"] Apr 24 21:27:34.931499 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:34.931372 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ba4aa1b_ee26_4c4e_aa9d_19fdc8329dca.slice/crio-f4a015c1873941ca43b33849dcfdac8864658d22650a6b08c557c707c8ac75ac WatchSource:0}: Error finding container f4a015c1873941ca43b33849dcfdac8864658d22650a6b08c557c707c8ac75ac: Status 404 returned error can't find the container with id f4a015c1873941ca43b33849dcfdac8864658d22650a6b08c557c707c8ac75ac Apr 24 21:27:34.946520 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.946371 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:34.965814 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:34.965784 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7"] Apr 24 21:27:35.039362 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.039308 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-w28n7"] Apr 24 21:27:35.110721 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.110525 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7859dbcb8d-bfkg5"] Apr 24 21:27:35.113557 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:35.113522 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12bfbd31_9b7c_48ae_b6fd_469c93decd67.slice/crio-3f59572174d011da736e362cc6bcbda4e92474d1dfd9a1219864fcddea3f0e4f WatchSource:0}: Error finding container 3f59572174d011da736e362cc6bcbda4e92474d1dfd9a1219864fcddea3f0e4f: Status 404 returned error can't find the container with id 3f59572174d011da736e362cc6bcbda4e92474d1dfd9a1219864fcddea3f0e4f Apr 24 21:27:35.616393 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.616351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" event={"ID":"12bfbd31-9b7c-48ae-b6fd-469c93decd67","Type":"ContainerStarted","Data":"257a1334a7ce4b8a4772d25d763ac3228dbfc0c02fca613aab0d386b665ca608"} Apr 24 21:27:35.616393 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.616395 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" event={"ID":"12bfbd31-9b7c-48ae-b6fd-469c93decd67","Type":"ContainerStarted","Data":"3f59572174d011da736e362cc6bcbda4e92474d1dfd9a1219864fcddea3f0e4f"} Apr 24 21:27:35.616867 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.616489 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" Apr 24 21:27:35.618099 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.618074 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h6bcf" event={"ID":"74d17f36-1527-485c-ad29-abd7e0f42a70","Type":"ContainerStarted","Data":"3985411308e34b839eb7f6a0c1b82bc5fc9514a85e5ca41e985035e318673839"} Apr 24 21:27:35.618099 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.618099 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h6bcf" event={"ID":"74d17f36-1527-485c-ad29-abd7e0f42a70","Type":"ContainerStarted","Data":"d9c950063bd7fb2329e2a39afdfbe45ef3d7371318f547e85da8a532e7f4b834"} Apr 24 21:27:35.618278 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.618193 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-h6bcf" Apr 24 21:27:35.619484 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.619459 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4ph6t" event={"ID":"867d54ac-7685-453e-ab98-671a28e06ea0","Type":"ContainerStarted","Data":"33b72fd5ef52a7b1088e6cd9f374eb7a0805a2dc6c3a6247f19a36d63f79aace"} Apr 24 21:27:35.620872 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.620852 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w28n7" event={"ID":"2231f388-e1f5-4ecf-b03c-f5f6c48450c3","Type":"ContainerStarted","Data":"708b9ddb100ffb91d09a3062ee89283463c4f9242c4dec3065c1c52ef3c4c115"} Apr 24 21:27:35.620872 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.620878 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w28n7" 
event={"ID":"2231f388-e1f5-4ecf-b03c-f5f6c48450c3","Type":"ContainerStarted","Data":"33aacce4a7a97286e53e424435559dbd53af93f94189ef6b8aa8a46a40045c83"} Apr 24 21:27:35.622013 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.621991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7" event={"ID":"c8f1d732-be21-403c-91d6-d45eb0f2afef","Type":"ContainerStarted","Data":"6186c26e7a8fed4e798725af12c1a91d69be4adfae564ef15a20dbfc139b3d4b"} Apr 24 21:27:35.622952 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.622931 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s" event={"ID":"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca","Type":"ContainerStarted","Data":"f4a015c1873941ca43b33849dcfdac8864658d22650a6b08c557c707c8ac75ac"} Apr 24 21:27:35.623906 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.623886 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7dwzn" event={"ID":"7ac4e3b6-523e-4f41-98be-ceb879813ac3","Type":"ContainerStarted","Data":"802b1cddb834ab67a2bd03f5892c81061f817012ca89adf92f21287d8b1f37a4"} Apr 24 21:27:35.637162 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.637116 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5" podStartSLOduration=1.637103328 podStartE2EDuration="1.637103328s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:35.636227004 +0000 UTC m=+66.876580264" watchObservedRunningTime="2026-04-24 21:27:35.637103328 +0000 UTC m=+66.877456566" Apr 24 21:27:35.658060 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.658014 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-4ph6t" podStartSLOduration=33.105040959 podStartE2EDuration="35.657996859s" podCreationTimestamp="2026-04-24 21:27:00 +0000 UTC" firstStartedPulling="2026-04-24 21:27:32.183631548 +0000 UTC m=+63.423984786" lastFinishedPulling="2026-04-24 21:27:34.73658744 +0000 UTC m=+65.976940686" observedRunningTime="2026-04-24 21:27:35.656177038 +0000 UTC m=+66.896530299" watchObservedRunningTime="2026-04-24 21:27:35.657996859 +0000 UTC m=+66.898350119" Apr 24 21:27:35.674389 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:35.674340 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h6bcf" podStartSLOduration=33.417568366 podStartE2EDuration="35.67432342s" podCreationTimestamp="2026-04-24 21:27:00 +0000 UTC" firstStartedPulling="2026-04-24 21:27:32.48246854 +0000 UTC m=+63.722821778" lastFinishedPulling="2026-04-24 21:27:34.739223581 +0000 UTC m=+65.979576832" observedRunningTime="2026-04-24 21:27:35.673946959 +0000 UTC m=+66.914300219" watchObservedRunningTime="2026-04-24 21:27:35.67432342 +0000 UTC m=+66.914676681" Apr 24 21:27:36.543261 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:36.543233 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bg9mt" Apr 24 21:27:38.153989 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.153957 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h"] Apr 24 21:27:38.157099 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.157076 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h" Apr 24 21:27:38.159834 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.159807 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 21:27:38.159955 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.159906 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-2ggb2\"" Apr 24 21:27:38.166394 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.166374 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h"] Apr 24 21:27:38.190203 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.190170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ffd7379c-7840-42d9-9eaa-f4fdd5edf13c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wqj4h\" (UID: \"ffd7379c-7840-42d9-9eaa-f4fdd5edf13c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h" Apr 24 21:27:38.291138 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.291099 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ffd7379c-7840-42d9-9eaa-f4fdd5edf13c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wqj4h\" (UID: \"ffd7379c-7840-42d9-9eaa-f4fdd5edf13c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h" Apr 24 21:27:38.293597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.293573 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/ffd7379c-7840-42d9-9eaa-f4fdd5edf13c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-wqj4h\" (UID: \"ffd7379c-7840-42d9-9eaa-f4fdd5edf13c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h" Apr 24 21:27:38.465260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.465181 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h" Apr 24 21:27:38.631895 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.631842 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h"] Apr 24 21:27:38.635867 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:38.635838 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffd7379c_7840_42d9_9eaa_f4fdd5edf13c.slice/crio-8c20851eddd2c0b6c90738b018569e743355884ba87f9eadb1ad8516b1e913b9 WatchSource:0}: Error finding container 8c20851eddd2c0b6c90738b018569e743355884ba87f9eadb1ad8516b1e913b9: Status 404 returned error can't find the container with id 8c20851eddd2c0b6c90738b018569e743355884ba87f9eadb1ad8516b1e913b9 Apr 24 21:27:38.637262 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.636821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w28n7" event={"ID":"2231f388-e1f5-4ecf-b03c-f5f6c48450c3","Type":"ContainerStarted","Data":"05b952f406e325ddc9f3d142f034e9528e1ff60b88077585e8a7b81e21f2fb1f"} Apr 24 21:27:38.638936 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.638899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7" 
event={"ID":"c8f1d732-be21-403c-91d6-d45eb0f2afef","Type":"ContainerStarted","Data":"12899b1514e55779e8aa50532fdc14b415304a1f535cf50f13f86679d390aa40"} Apr 24 21:27:38.641411 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.641387 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7dwzn" event={"ID":"7ac4e3b6-523e-4f41-98be-ceb879813ac3","Type":"ContainerStarted","Data":"cf13af82b31c5ff9d16b250b25c723a9c9a2bc71261d4efe5837807491146eb4"} Apr 24 21:27:38.641518 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.641508 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7dwzn" event={"ID":"7ac4e3b6-523e-4f41-98be-ceb879813ac3","Type":"ContainerStarted","Data":"0c7d1514837976512af0402a4a28bd557195fc8251f4f079a8272bc8ab134f14"} Apr 24 21:27:38.663968 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.663919 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-778cb564df-w9lf7" podStartSLOduration=1.724621112 podStartE2EDuration="4.663906901s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:34.974497461 +0000 UTC m=+66.214850711" lastFinishedPulling="2026-04-24 21:27:37.913783257 +0000 UTC m=+69.154136500" observedRunningTime="2026-04-24 21:27:38.66234868 +0000 UTC m=+69.902701940" watchObservedRunningTime="2026-04-24 21:27:38.663906901 +0000 UTC m=+69.904260161" Apr 24 21:27:38.667565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.667540 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-bwk94"] Apr 24 21:27:38.672217 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.672197 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-bwk94" Apr 24 21:27:38.675902 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.675761 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:27:38.676181 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.676012 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:27:38.676334 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.676315 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5rbmf\"" Apr 24 21:27:38.682584 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.682545 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-bwk94"] Apr 24 21:27:38.687956 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.687907 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7dwzn" podStartSLOduration=66.735979249 podStartE2EDuration="1m9.687890261s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:34.895372816 +0000 UTC m=+66.135726062" lastFinishedPulling="2026-04-24 21:27:37.84728382 +0000 UTC m=+69.087637074" observedRunningTime="2026-04-24 21:27:38.687033184 +0000 UTC m=+69.927386449" watchObservedRunningTime="2026-04-24 21:27:38.687890261 +0000 UTC m=+69.928243522" Apr 24 21:27:38.694441 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.694417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2cxs\" (UniqueName: \"kubernetes.io/projected/5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b-kube-api-access-k2cxs\") pod \"downloads-6bcc868b7-bwk94\" (UID: \"5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b\") " pod="openshift-console/downloads-6bcc868b7-bwk94" Apr 24 
21:27:38.795595 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.795556 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2cxs\" (UniqueName: \"kubernetes.io/projected/5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b-kube-api-access-k2cxs\") pod \"downloads-6bcc868b7-bwk94\" (UID: \"5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b\") " pod="openshift-console/downloads-6bcc868b7-bwk94"
Apr 24 21:27:38.805291 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.805268 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2cxs\" (UniqueName: \"kubernetes.io/projected/5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b-kube-api-access-k2cxs\") pod \"downloads-6bcc868b7-bwk94\" (UID: \"5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b\") " pod="openshift-console/downloads-6bcc868b7-bwk94"
Apr 24 21:27:38.983583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:38.983465 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-bwk94"
Apr 24 21:27:39.130568 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:39.130471 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-bwk94"]
Apr 24 21:27:39.521532 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:39.521499 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bcc1f5d_07ac_458a_bbdd_0e8a5bbc5f5b.slice/crio-53676b64ef884f3a0348cde2f3f6d412cffb3a71a38946f07a69da47d5f722be WatchSource:0}: Error finding container 53676b64ef884f3a0348cde2f3f6d412cffb3a71a38946f07a69da47d5f722be: Status 404 returned error can't find the container with id 53676b64ef884f3a0348cde2f3f6d412cffb3a71a38946f07a69da47d5f722be
Apr 24 21:27:39.645653 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:39.645614 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-bwk94" event={"ID":"5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b","Type":"ContainerStarted","Data":"53676b64ef884f3a0348cde2f3f6d412cffb3a71a38946f07a69da47d5f722be"}
Apr 24 21:27:39.646801 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:39.646767 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h" event={"ID":"ffd7379c-7840-42d9-9eaa-f4fdd5edf13c","Type":"ContainerStarted","Data":"8c20851eddd2c0b6c90738b018569e743355884ba87f9eadb1ad8516b1e913b9"}
Apr 24 21:27:41.655801 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.655763 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w28n7" event={"ID":"2231f388-e1f5-4ecf-b03c-f5f6c48450c3","Type":"ContainerStarted","Data":"72e1e7f70f791160e6a9e5198a71842780e05f2d19951ac3ba754c81e7f92475"}
Apr 24 21:27:41.657210 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.657186 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s" event={"ID":"9ba4aa1b-ee26-4c4e-aa9d-19fdc8329dca","Type":"ContainerStarted","Data":"eabcbf56445ab1c3dc9831adff4c271458b9de63ac7c0d4b9c9441ec5d529b50"}
Apr 24 21:27:41.657405 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.657387 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:41.658629 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.658603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h" event={"ID":"ffd7379c-7840-42d9-9eaa-f4fdd5edf13c","Type":"ContainerStarted","Data":"3f6fef19eb9a1c61a80322550a7491fd86531a998f0e58daed933add49264f94"}
Apr 24 21:27:41.658822 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.658803 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h"
Apr 24 21:27:41.659273 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.659243 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s"
Apr 24 21:27:41.665299 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.665280 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h"
Apr 24 21:27:41.677855 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.677813 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-w28n7" podStartSLOduration=1.882975231 podStartE2EDuration="7.677799376s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:35.203443516 +0000 UTC m=+66.443796768" lastFinishedPulling="2026-04-24 21:27:40.998267673 +0000 UTC m=+72.238620913" observedRunningTime="2026-04-24 21:27:41.67653847 +0000 UTC m=+72.916891730" watchObservedRunningTime="2026-04-24 21:27:41.677799376 +0000 UTC m=+72.918152640"
Apr 24 21:27:41.700396 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:41.700354 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-wqj4h" podStartSLOduration=1.339929136 podStartE2EDuration="3.700341009s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:38.637858121 +0000 UTC m=+69.878211372" lastFinishedPulling="2026-04-24 21:27:40.998269992 +0000 UTC m=+72.238623245" observedRunningTime="2026-04-24 21:27:41.699127359 +0000 UTC m=+72.939480620" watchObservedRunningTime="2026-04-24 21:27:41.700341009 +0000 UTC m=+72.940694270"
Apr 24 21:27:45.629785 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:45.629751 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h6bcf"
Apr 24 21:27:45.649221 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:45.649167 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-98894b8fb-l458s" podStartSLOduration=5.572711491 podStartE2EDuration="11.649149582s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:34.934816503 +0000 UTC m=+66.175169745" lastFinishedPulling="2026-04-24 21:27:41.011254591 +0000 UTC m=+72.251607836" observedRunningTime="2026-04-24 21:27:41.717344255 +0000 UTC m=+72.957697527" watchObservedRunningTime="2026-04-24 21:27:45.649149582 +0000 UTC m=+76.889502843"
Apr 24 21:27:47.700066 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.699775 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dmjqv"]
Apr 24 21:27:47.703329 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.703297 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.706718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.706693 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:27:47.707828 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.707799 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:27:47.708459 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.708439 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lx6q7\""
Apr 24 21:27:47.709466 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.709195 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:27:47.709466 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.709210 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:27:47.709466 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.709226 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:27:47.709466 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.709244 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:27:47.768337 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8abf106c-e808-4363-b918-dad844446b34-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.768486 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768355 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8abf106c-e808-4363-b918-dad844446b34-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.768486 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8abf106c-e808-4363-b918-dad844446b34-node-exporter-textfile\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.768486 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8abf106c-e808-4363-b918-dad844446b34-metrics-client-ca\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.768486 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768447 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-node-exporter-wtmp\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.768486 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768468 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8abf106c-e808-4363-b918-dad844446b34-node-exporter-tls\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.768709 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768492 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87mcg\" (UniqueName: \"kubernetes.io/projected/8abf106c-e808-4363-b918-dad844446b34-kube-api-access-87mcg\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.768709 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768513 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-sys\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.768709 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.768532 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-root\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869060 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.868928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8abf106c-e808-4363-b918-dad844446b34-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869060 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.868985 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8abf106c-e808-4363-b918-dad844446b34-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869060 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8abf106c-e808-4363-b918-dad844446b34-node-exporter-textfile\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869060 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869052 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8abf106c-e808-4363-b918-dad844446b34-metrics-client-ca\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-node-exporter-wtmp\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8abf106c-e808-4363-b918-dad844446b34-node-exporter-tls\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869127 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87mcg\" (UniqueName: \"kubernetes.io/projected/8abf106c-e808-4363-b918-dad844446b34-kube-api-access-87mcg\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869151 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-sys\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-root\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-node-exporter-wtmp\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869261 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-root\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869691 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869577 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8abf106c-e808-4363-b918-dad844446b34-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869691 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8abf106c-e808-4363-b918-dad844446b34-sys\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869824 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8abf106c-e808-4363-b918-dad844446b34-metrics-client-ca\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.869946 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.869927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8abf106c-e808-4363-b918-dad844446b34-node-exporter-textfile\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.871963 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.871943 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8abf106c-e808-4363-b918-dad844446b34-node-exporter-tls\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.872029 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.871963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8abf106c-e808-4363-b918-dad844446b34-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:47.878486 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:47.878465 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87mcg\" (UniqueName: \"kubernetes.io/projected/8abf106c-e808-4363-b918-dad844446b34-kube-api-access-87mcg\") pod \"node-exporter-dmjqv\" (UID: \"8abf106c-e808-4363-b918-dad844446b34\") " pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:48.014804 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:48.014773 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dmjqv"
Apr 24 21:27:48.024327 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:48.024298 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8abf106c_e808_4363_b918_dad844446b34.slice/crio-2fc6ffc4ba0d6bf4c81a72d850247d958fac18ef39637f362c9420b1cfef4d01 WatchSource:0}: Error finding container 2fc6ffc4ba0d6bf4c81a72d850247d958fac18ef39637f362c9420b1cfef4d01: Status 404 returned error can't find the container with id 2fc6ffc4ba0d6bf4c81a72d850247d958fac18ef39637f362c9420b1cfef4d01
Apr 24 21:27:48.684089 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:48.684024 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmjqv" event={"ID":"8abf106c-e808-4363-b918-dad844446b34","Type":"ContainerStarted","Data":"2fc6ffc4ba0d6bf4c81a72d850247d958fac18ef39637f362c9420b1cfef4d01"}
Apr 24 21:27:49.688624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:49.688583 2569 generic.go:358] "Generic (PLEG): container finished" podID="8abf106c-e808-4363-b918-dad844446b34" containerID="ea7e0e359175e7f9d35aeec54dcde52fedcb2c2bcc07d2f695115b40454a79d2" exitCode=0
Apr 24 21:27:49.689054 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:49.688664 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmjqv" event={"ID":"8abf106c-e808-4363-b918-dad844446b34","Type":"ContainerDied","Data":"ea7e0e359175e7f9d35aeec54dcde52fedcb2c2bcc07d2f695115b40454a79d2"}
Apr 24 21:27:54.621074 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:54.621023 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:27:56.632125 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:56.632093 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7859dbcb8d-bfkg5"
Apr 24 21:27:56.710936 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:56.710901 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-bwk94" event={"ID":"5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b","Type":"ContainerStarted","Data":"b8e13724d24f994ee3f2d3e21790bf57d06d3847b64e6dbf896e20018aea53a9"}
Apr 24 21:27:56.711542 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:56.711238 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-bwk94"
Apr 24 21:27:56.713240 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:56.713214 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmjqv" event={"ID":"8abf106c-e808-4363-b918-dad844446b34","Type":"ContainerStarted","Data":"8de99130ac12c84d8bb7dbf1127d17fad9e5329431f0577d322eea6ba2940a50"}
Apr 24 21:27:56.713358 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:56.713244 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmjqv" event={"ID":"8abf106c-e808-4363-b918-dad844446b34","Type":"ContainerStarted","Data":"15cbfdd8353dea9e79bf34cc004bf4382a4076b45deace8ba53766c14ec5585c"}
Apr 24 21:27:56.722726 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:56.722701 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-bwk94"
Apr 24 21:27:56.738543 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:56.738495 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-bwk94" podStartSLOduration=2.063852272 podStartE2EDuration="18.738476307s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:39.523743065 +0000 UTC m=+70.764096303" lastFinishedPulling="2026-04-24 21:27:56.198367092 +0000 UTC m=+87.438720338" observedRunningTime="2026-04-24 21:27:56.736248067 +0000 UTC m=+87.976601350" watchObservedRunningTime="2026-04-24 21:27:56.738476307 +0000 UTC m=+87.978829568"
Apr 24 21:27:56.762221 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:56.762161 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dmjqv" podStartSLOduration=8.927631953 podStartE2EDuration="9.762144283s" podCreationTimestamp="2026-04-24 21:27:47 +0000 UTC" firstStartedPulling="2026-04-24 21:27:48.026263979 +0000 UTC m=+79.266617220" lastFinishedPulling="2026-04-24 21:27:48.860776312 +0000 UTC m=+80.101129550" observedRunningTime="2026-04-24 21:27:56.759738405 +0000 UTC m=+88.000091667" watchObservedRunningTime="2026-04-24 21:27:56.762144283 +0000 UTC m=+88.002497543"
Apr 24 21:27:57.428506 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.428465 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66ccddbfc8-gq86x"]
Apr 24 21:27:57.433400 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.433371 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.438995 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.438971 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 21:27:57.439123 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.438977 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 21:27:57.441683 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.441647 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 21:27:57.442515 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.442496 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bdp6p\""
Apr 24 21:27:57.442749 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.442729 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 21:27:57.443047 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.443033 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 21:27:57.446959 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.446933 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66ccddbfc8-gq86x"]
Apr 24 21:27:57.449791 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.449773 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 21:27:57.553250 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.553212 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdgg\" (UniqueName: \"kubernetes.io/projected/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-kube-api-access-thdgg\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.553457 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.553263 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-serving-cert\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.553457 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.553353 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-trusted-ca-bundle\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.553457 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.553402 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-oauth-config\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.553616 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.553457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-service-ca\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.553616 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.553523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-config\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.553616 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.553564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-oauth-serving-cert\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.654106 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-trusted-ca-bundle\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.654554 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654119 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-oauth-config\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.654554 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-service-ca\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.654554 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-config\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.654554 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-oauth-serving-cert\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.654554 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thdgg\" (UniqueName: \"kubernetes.io/projected/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-kube-api-access-thdgg\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.654554 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654269 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-serving-cert\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.655005 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654962 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-service-ca\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.655131 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.654969 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-oauth-serving-cert\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.655131 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.655111 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-trusted-ca-bundle\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.655263 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.655243 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-config\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.657099 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.657077 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-oauth-config\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.657200 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.657173 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-serving-cert\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.673246 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.673216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdgg\" (UniqueName: \"kubernetes.io/projected/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-kube-api-access-thdgg\") pod \"console-66ccddbfc8-gq86x\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.744827 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.744736 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66ccddbfc8-gq86x"
Apr 24 21:27:57.917733 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:57.917687 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66ccddbfc8-gq86x"]
Apr 24 21:27:57.921021 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:27:57.920976 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9f529f8_d9b9_4c93_8d7a_fc770c64e91a.slice/crio-a174117fb4807d2b771f8d51fe66e4ff26e4bd38bfb0dc0c41977eeae41729d7 WatchSource:0}: Error finding container a174117fb4807d2b771f8d51fe66e4ff26e4bd38bfb0dc0c41977eeae41729d7: Status 404 returned error can't find the container with id a174117fb4807d2b771f8d51fe66e4ff26e4bd38bfb0dc0c41977eeae41729d7
Apr 24 21:27:58.722054 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:58.721963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ccddbfc8-gq86x" event={"ID":"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a","Type":"ContainerStarted","Data":"a174117fb4807d2b771f8d51fe66e4ff26e4bd38bfb0dc0c41977eeae41729d7"}
Apr 24 21:27:59.630380 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:27:59.630321 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-747cff9957-t7znk" podUID="20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" containerName="registry" containerID="cri-o://e1110a5ff699d230a5fa91a09a13e67e86b2a36459ef3aceef10b4a7b06b50b7" gracePeriod=30
Apr 24 21:28:00.730541 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:00.730499 2569 generic.go:358] "Generic (PLEG): container finished" podID="20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" containerID="e1110a5ff699d230a5fa91a09a13e67e86b2a36459ef3aceef10b4a7b06b50b7" exitCode=0
Apr 24 21:28:00.731107 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:00.730554 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747cff9957-t7znk" event={"ID":"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d","Type":"ContainerDied","Data":"e1110a5ff699d230a5fa91a09a13e67e86b2a36459ef3aceef10b4a7b06b50b7"}
Apr 24 21:28:01.172766 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.172740 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747cff9957-t7znk"
Apr 24 21:28:01.289447 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289360 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-trusted-ca\") pod \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") "
Apr 24 21:28:01.289447 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289412 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m7tw\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-kube-api-access-5m7tw\") pod \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") "
Apr 24 21:28:01.289447 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289441 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-bound-sa-token\") pod \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") "
Apr 24 21:28:01.289716 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289465 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") pod \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") "
Apr 24 21:28:01.289716 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289491 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-certificates\") pod \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") "
Apr 24 21:28:01.289716 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289534 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-installation-pull-secrets\") pod \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") "
Apr 24 21:28:01.289716 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289590 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-ca-trust-extracted\") pod \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") "
Apr 24 21:28:01.289716 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289629 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName:
\"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-image-registry-private-configuration\") pod \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\" (UID: \"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d\") " Apr 24 21:28:01.289891 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289799 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:28:01.289941 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.289898 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-trusted-ca\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:28:01.290149 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.290108 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:28:01.292813 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.292777 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:28:01.292987 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.292947 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-kube-api-access-5m7tw" (OuterVolumeSpecName: "kube-api-access-5m7tw") pod "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d"). InnerVolumeSpecName "kube-api-access-5m7tw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:28:01.293311 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.293266 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:28:01.294148 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.294123 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:28:01.294262 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.294240 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:28:01.300478 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.300452 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" (UID: "20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:28:01.390472 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.390435 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-ca-trust-extracted\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:28:01.390472 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.390470 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-image-registry-private-configuration\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:28:01.390472 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.390481 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5m7tw\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-kube-api-access-5m7tw\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:28:01.390718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.390493 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-bound-sa-token\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:28:01.390718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.390502 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-tls\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:28:01.390718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.390509 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-registry-certificates\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:28:01.390718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.390517 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d-installation-pull-secrets\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:28:01.735345 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.735311 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-747cff9957-t7znk" Apr 24 21:28:01.735345 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.735322 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-747cff9957-t7znk" event={"ID":"20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d","Type":"ContainerDied","Data":"f134489e1035118ab4f770860bb3522ae76ec031acea9609103e591abde82b44"} Apr 24 21:28:01.735870 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.735374 2569 scope.go:117] "RemoveContainer" containerID="e1110a5ff699d230a5fa91a09a13e67e86b2a36459ef3aceef10b4a7b06b50b7" Apr 24 21:28:01.737600 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.737506 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ccddbfc8-gq86x" event={"ID":"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a","Type":"ContainerStarted","Data":"4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03"} Apr 24 21:28:01.763348 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.763311 2569 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-747cff9957-t7znk"] Apr 24 21:28:01.768511 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.768473 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-747cff9957-t7znk"] Apr 24 21:28:01.790308 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:01.790247 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66ccddbfc8-gq86x" podStartSLOduration=1.263637777 podStartE2EDuration="4.7902281s" podCreationTimestamp="2026-04-24 21:27:57 +0000 UTC" firstStartedPulling="2026-04-24 21:27:57.923136991 +0000 UTC m=+89.163490232" lastFinishedPulling="2026-04-24 21:28:01.449727303 +0000 UTC m=+92.690080555" observedRunningTime="2026-04-24 21:28:01.788483125 +0000 UTC m=+93.028836412" watchObservedRunningTime="2026-04-24 21:28:01.7902281 +0000 UTC m=+93.030581360" Apr 24 21:28:03.332149 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:03.332117 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" path="/var/lib/kubelet/pods/20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d/volumes" Apr 24 21:28:07.744970 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:07.744938 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66ccddbfc8-gq86x" Apr 24 21:28:07.745415 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:07.744980 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66ccddbfc8-gq86x" Apr 24 21:28:07.749951 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:07.749926 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66ccddbfc8-gq86x" Apr 24 21:28:07.759514 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:07.759489 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-66ccddbfc8-gq86x" Apr 24 21:28:44.842836 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.842797 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bcb5b4bcd-mb7jq"] Apr 24 21:28:44.843288 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.843056 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" containerName="registry" Apr 24 21:28:44.843288 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.843068 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" containerName="registry" Apr 24 21:28:44.843288 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.843111 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="20c7bcc1-ce93-4ec1-9668-3ce24cd22b6d" containerName="registry" Apr 24 21:28:44.896246 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.896208 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bcb5b4bcd-mb7jq"] Apr 24 21:28:44.896406 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.896356 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:44.908893 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.908863 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-service-ca\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:44.909044 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.908902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-trusted-ca-bundle\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:44.909044 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.908921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-console-config\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:44.909044 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.908943 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-serving-cert\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:44.909044 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.908992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6x5\" (UniqueName: 
\"kubernetes.io/projected/b46ba90c-a614-444e-a716-7cd74dd392fd-kube-api-access-pd6x5\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:44.909044 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.909007 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-oauth-config\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:44.909044 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:44.909022 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-oauth-serving-cert\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.009361 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.009319 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-serving-cert\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.009585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.009400 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6x5\" (UniqueName: \"kubernetes.io/projected/b46ba90c-a614-444e-a716-7cd74dd392fd-kube-api-access-pd6x5\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.009585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.009429 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-oauth-config\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.009585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.009451 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-oauth-serving-cert\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.009585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.009495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-service-ca\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.009837 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.009615 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-trusted-ca-bundle\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.009837 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.009700 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-console-config\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.010274 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:28:45.010242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-oauth-serving-cert\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.010447 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.010422 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-console-config\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.010571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.010555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-trusted-ca-bundle\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.010611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.010554 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-service-ca\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.012035 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.012001 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-serving-cert\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.012131 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:28:45.012067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-oauth-config\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.018811 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.018790 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6x5\" (UniqueName: \"kubernetes.io/projected/b46ba90c-a614-444e-a716-7cd74dd392fd-kube-api-access-pd6x5\") pod \"console-bcb5b4bcd-mb7jq\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.205248 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.205139 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:45.336790 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.333338 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bcb5b4bcd-mb7jq"] Apr 24 21:28:45.857062 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.857027 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcb5b4bcd-mb7jq" event={"ID":"b46ba90c-a614-444e-a716-7cd74dd392fd","Type":"ContainerStarted","Data":"46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936"} Apr 24 21:28:45.857062 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.857066 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcb5b4bcd-mb7jq" event={"ID":"b46ba90c-a614-444e-a716-7cd74dd392fd","Type":"ContainerStarted","Data":"a54d429605577f441b725cd9e938b72aa95f2dd2c9465914113385f836880c68"} Apr 24 21:28:45.877434 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:45.877373 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-bcb5b4bcd-mb7jq" podStartSLOduration=1.877357932 podStartE2EDuration="1.877357932s" podCreationTimestamp="2026-04-24 21:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:45.875977277 +0000 UTC m=+137.116330538" watchObservedRunningTime="2026-04-24 21:28:45.877357932 +0000 UTC m=+137.117711193" Apr 24 21:28:55.206176 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:55.206131 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:55.206176 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:55.206189 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:55.211007 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:55.210985 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:55.888954 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:55.888928 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:28:55.949832 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:28:55.949798 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66ccddbfc8-gq86x"] Apr 24 21:29:20.969838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:20.969735 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66ccddbfc8-gq86x" podUID="b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" containerName="console" containerID="cri-o://4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03" gracePeriod=15 Apr 24 21:29:21.201535 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.201514 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-66ccddbfc8-gq86x_b9f529f8-d9b9-4c93-8d7a-fc770c64e91a/console/0.log" Apr 24 21:29:21.201641 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.201571 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66ccddbfc8-gq86x" Apr 24 21:29:21.263557 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.263530 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-oauth-serving-cert\") pod \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " Apr 24 21:29:21.263712 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.263597 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-oauth-config\") pod \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " Apr 24 21:29:21.263712 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.263617 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-serving-cert\") pod \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " Apr 24 21:29:21.263812 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.263790 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-service-ca\") pod \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " Apr 24 21:29:21.263874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.263857 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-thdgg\" (UniqueName: \"kubernetes.io/projected/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-kube-api-access-thdgg\") pod \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " Apr 24 21:29:21.263931 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.263921 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-config\") pod \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " Apr 24 21:29:21.263982 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.263932 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" (UID: "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:21.263982 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.263969 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-trusted-ca-bundle\") pod \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\" (UID: \"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a\") " Apr 24 21:29:21.264157 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.264142 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-oauth-serving-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.264258 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.264147 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-service-ca" (OuterVolumeSpecName: "service-ca") pod "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" (UID: "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:21.264338 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.264296 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-config" (OuterVolumeSpecName: "console-config") pod "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" (UID: "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:21.264618 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.264547 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" (UID: "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:21.266010 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.265985 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" (UID: "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:21.266100 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.266034 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" (UID: "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:21.266145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.266096 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-kube-api-access-thdgg" (OuterVolumeSpecName: "kube-api-access-thdgg") pod "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" (UID: "b9f529f8-d9b9-4c93-8d7a-fc770c64e91a"). InnerVolumeSpecName "kube-api-access-thdgg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:21.365419 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.365394 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thdgg\" (UniqueName: \"kubernetes.io/projected/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-kube-api-access-thdgg\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.365419 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.365418 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-config\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.365558 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.365430 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-trusted-ca-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.365558 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.365439 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-oauth-config\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.365558 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.365449 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-console-serving-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.365558 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.365458 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a-service-ca\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.950724 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:29:21.950699 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66ccddbfc8-gq86x_b9f529f8-d9b9-4c93-8d7a-fc770c64e91a/console/0.log" Apr 24 21:29:21.950897 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.950742 2569 generic.go:358] "Generic (PLEG): container finished" podID="b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" containerID="4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03" exitCode=2 Apr 24 21:29:21.950897 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.950809 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66ccddbfc8-gq86x" Apr 24 21:29:21.950897 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.950827 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ccddbfc8-gq86x" event={"ID":"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a","Type":"ContainerDied","Data":"4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03"} Apr 24 21:29:21.950897 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.950863 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ccddbfc8-gq86x" event={"ID":"b9f529f8-d9b9-4c93-8d7a-fc770c64e91a","Type":"ContainerDied","Data":"a174117fb4807d2b771f8d51fe66e4ff26e4bd38bfb0dc0c41977eeae41729d7"} Apr 24 21:29:21.950897 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.950878 2569 scope.go:117] "RemoveContainer" containerID="4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03" Apr 24 21:29:21.958308 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.958292 2569 scope.go:117] "RemoveContainer" containerID="4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03" Apr 24 21:29:21.958566 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:29:21.958547 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03\": container with ID starting with 4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03 not found: ID does not exist" containerID="4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03" Apr 24 21:29:21.958613 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.958579 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03"} err="failed to get container status \"4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03\": rpc error: code = NotFound desc = could not find container \"4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03\": container with ID starting with 4e0fa9e272abe2f5513c0a86b33c6ec7517465d2b842aa441ce072d12e0abc03 not found: ID does not exist" Apr 24 21:29:21.991283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:21.991260 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66ccddbfc8-gq86x"] Apr 24 21:29:22.008886 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:22.008855 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66ccddbfc8-gq86x"] Apr 24 21:29:23.331868 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:29:23.331827 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" path="/var/lib/kubelet/pods/b9f529f8-d9b9-4c93-8d7a-fc770c64e91a/volumes" Apr 24 21:30:24.791904 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.791867 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67fbd788db-nrgbl"] Apr 24 21:30:24.792343 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.792137 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" containerName="console" Apr 24 21:30:24.792343 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:30:24.792154 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" containerName="console" Apr 24 21:30:24.792343 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.792198 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9f529f8-d9b9-4c93-8d7a-fc770c64e91a" containerName="console" Apr 24 21:30:24.794887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.794870 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:24.808582 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.808553 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67fbd788db-nrgbl"] Apr 24 21:30:24.924007 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.923970 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkmgc\" (UniqueName: \"kubernetes.io/projected/738082d9-1cd9-449c-a217-e889775afdaa-kube-api-access-xkmgc\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:24.924185 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.924024 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-oauth-config\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:24.924185 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.924054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-oauth-serving-cert\") pod \"console-67fbd788db-nrgbl\" (UID: 
\"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:24.924185 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.924074 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-console-config\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:24.924185 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.924098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-service-ca\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:24.924337 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.924196 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-trusted-ca-bundle\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:24.924337 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:24.924230 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-serving-cert\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.025363 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.025318 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-trusted-ca-bundle\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.025363 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.025366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-serving-cert\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.025610 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.025400 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkmgc\" (UniqueName: \"kubernetes.io/projected/738082d9-1cd9-449c-a217-e889775afdaa-kube-api-access-xkmgc\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.025610 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.025433 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-oauth-config\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.025610 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.025450 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-oauth-serving-cert\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.025610 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.025480 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-console-config\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.025610 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.025516 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-service-ca\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.026176 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.026151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-oauth-serving-cert\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.026296 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.026247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-service-ca\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.026296 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.026276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-trusted-ca-bundle\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.026370 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:30:25.026315 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-console-config\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.028582 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.028556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-serving-cert\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.028717 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.028697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-oauth-config\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.035384 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.035364 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkmgc\" (UniqueName: \"kubernetes.io/projected/738082d9-1cd9-449c-a217-e889775afdaa-kube-api-access-xkmgc\") pod \"console-67fbd788db-nrgbl\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.103362 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.103279 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:25.236057 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:25.236030 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67fbd788db-nrgbl"] Apr 24 21:30:25.238030 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:30:25.237997 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod738082d9_1cd9_449c_a217_e889775afdaa.slice/crio-348b6ac4fd471ecbcaf35802ff507419630cbaf39ced3e7a156f0b835c71b695 WatchSource:0}: Error finding container 348b6ac4fd471ecbcaf35802ff507419630cbaf39ced3e7a156f0b835c71b695: Status 404 returned error can't find the container with id 348b6ac4fd471ecbcaf35802ff507419630cbaf39ced3e7a156f0b835c71b695 Apr 24 21:30:26.123955 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:26.123919 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67fbd788db-nrgbl" event={"ID":"738082d9-1cd9-449c-a217-e889775afdaa","Type":"ContainerStarted","Data":"d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33"} Apr 24 21:30:26.123955 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:26.123955 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67fbd788db-nrgbl" event={"ID":"738082d9-1cd9-449c-a217-e889775afdaa","Type":"ContainerStarted","Data":"348b6ac4fd471ecbcaf35802ff507419630cbaf39ced3e7a156f0b835c71b695"} Apr 24 21:30:26.145118 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:26.145067 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67fbd788db-nrgbl" podStartSLOduration=2.145053026 podStartE2EDuration="2.145053026s" podCreationTimestamp="2026-04-24 21:30:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:26.144082264 +0000 UTC m=+237.384435537" 
watchObservedRunningTime="2026-04-24 21:30:26.145053026 +0000 UTC m=+237.385406284" Apr 24 21:30:35.104173 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:35.104121 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:35.104173 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:35.104177 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:35.109399 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:35.109375 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:35.151320 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:35.151292 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:30:35.208405 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:35.208367 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bcb5b4bcd-mb7jq"] Apr 24 21:30:45.354292 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.354257 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c"] Apr 24 21:30:45.357595 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.357576 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.360398 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.360374 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:30:45.361685 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.361652 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:30:45.361753 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.361708 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-w8zch\"" Apr 24 21:30:45.367090 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.367066 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c"] Apr 24 21:30:45.387440 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.387399 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.387597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.387448 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8sp\" (UniqueName: \"kubernetes.io/projected/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-kube-api-access-zp8sp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" 
Apr 24 21:30:45.387597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.387500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.487898 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.487848 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.487898 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.487901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8sp\" (UniqueName: \"kubernetes.io/projected/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-kube-api-access-zp8sp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.488055 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.487925 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.488304 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:30:45.488280 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.488347 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.488333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.497434 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.497404 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8sp\" (UniqueName: \"kubernetes.io/projected/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-kube-api-access-zp8sp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.667712 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.667603 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" Apr 24 21:30:45.793574 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:45.793402 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c"] Apr 24 21:30:45.796408 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:30:45.796376 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87ae2d66_6fbf_4d59_a28f_25a89bc1e0a1.slice/crio-8269b1a2e4e4bfff0ae9cefc83e5890bb3782346bb108d3be18ecc9c1b3f7166 WatchSource:0}: Error finding container 8269b1a2e4e4bfff0ae9cefc83e5890bb3782346bb108d3be18ecc9c1b3f7166: Status 404 returned error can't find the container with id 8269b1a2e4e4bfff0ae9cefc83e5890bb3782346bb108d3be18ecc9c1b3f7166 Apr 24 21:30:46.177509 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:46.177474 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" event={"ID":"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1","Type":"ContainerStarted","Data":"8269b1a2e4e4bfff0ae9cefc83e5890bb3782346bb108d3be18ecc9c1b3f7166"} Apr 24 21:30:52.197540 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:52.197505 2569 generic.go:358] "Generic (PLEG): container finished" podID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerID="c2031f5464b031b7b88c6c1793d35831033cd03dfb9bb9fc2e16a1c308d10ce6" exitCode=0 Apr 24 21:30:52.197540 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:52.197544 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" event={"ID":"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1","Type":"ContainerDied","Data":"c2031f5464b031b7b88c6c1793d35831033cd03dfb9bb9fc2e16a1c308d10ce6"} Apr 24 21:30:55.207964 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:30:55.207928 2569 generic.go:358] "Generic (PLEG): container finished" podID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerID="9fe599d62c1c61c540d75dab07e8fc101aeeb95591764b793749650821e05849" exitCode=0 Apr 24 21:30:55.207964 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:30:55.207967 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" event={"ID":"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1","Type":"ContainerDied","Data":"9fe599d62c1c61c540d75dab07e8fc101aeeb95591764b793749650821e05849"} Apr 24 21:31:00.229090 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:00.229015 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bcb5b4bcd-mb7jq" podUID="b46ba90c-a614-444e-a716-7cd74dd392fd" containerName="console" containerID="cri-o://46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936" gracePeriod=15 Apr 24 21:31:02.393809 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.393787 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bcb5b4bcd-mb7jq_b46ba90c-a614-444e-a716-7cd74dd392fd/console/0.log" Apr 24 21:31:02.394126 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.393849 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bcb5b4bcd-mb7jq" Apr 24 21:31:02.412404 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.412347 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-oauth-config\") pod \"b46ba90c-a614-444e-a716-7cd74dd392fd\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " Apr 24 21:31:02.412404 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.412413 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-service-ca\") pod \"b46ba90c-a614-444e-a716-7cd74dd392fd\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " Apr 24 21:31:02.412652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.412446 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-serving-cert\") pod \"b46ba90c-a614-444e-a716-7cd74dd392fd\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " Apr 24 21:31:02.412652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.412496 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-oauth-serving-cert\") pod \"b46ba90c-a614-444e-a716-7cd74dd392fd\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " Apr 24 21:31:02.412652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.412550 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-trusted-ca-bundle\") pod \"b46ba90c-a614-444e-a716-7cd74dd392fd\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") " Apr 24 21:31:02.412652 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:31:02.412576 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6x5\" (UniqueName: \"kubernetes.io/projected/b46ba90c-a614-444e-a716-7cd74dd392fd-kube-api-access-pd6x5\") pod \"b46ba90c-a614-444e-a716-7cd74dd392fd\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") "
Apr 24 21:31:02.412652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.412610 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-console-config\") pod \"b46ba90c-a614-444e-a716-7cd74dd392fd\" (UID: \"b46ba90c-a614-444e-a716-7cd74dd392fd\") "
Apr 24 21:31:02.412926 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.412864 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-service-ca" (OuterVolumeSpecName: "service-ca") pod "b46ba90c-a614-444e-a716-7cd74dd392fd" (UID: "b46ba90c-a614-444e-a716-7cd74dd392fd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:02.413033 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.412999 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b46ba90c-a614-444e-a716-7cd74dd392fd" (UID: "b46ba90c-a614-444e-a716-7cd74dd392fd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:02.413266 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.413238 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b46ba90c-a614-444e-a716-7cd74dd392fd" (UID: "b46ba90c-a614-444e-a716-7cd74dd392fd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:02.413450 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.413420 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-console-config" (OuterVolumeSpecName: "console-config") pod "b46ba90c-a614-444e-a716-7cd74dd392fd" (UID: "b46ba90c-a614-444e-a716-7cd74dd392fd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:31:02.414932 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.414904 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b46ba90c-a614-444e-a716-7cd74dd392fd" (UID: "b46ba90c-a614-444e-a716-7cd74dd392fd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:02.415227 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.414929 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b46ba90c-a614-444e-a716-7cd74dd392fd" (UID: "b46ba90c-a614-444e-a716-7cd74dd392fd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:31:02.415316 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.415298 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46ba90c-a614-444e-a716-7cd74dd392fd-kube-api-access-pd6x5" (OuterVolumeSpecName: "kube-api-access-pd6x5") pod "b46ba90c-a614-444e-a716-7cd74dd392fd" (UID: "b46ba90c-a614-444e-a716-7cd74dd392fd"). InnerVolumeSpecName "kube-api-access-pd6x5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:31:02.514184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.514099 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-oauth-serving-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.514184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.514132 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-trusted-ca-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.514184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.514142 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pd6x5\" (UniqueName: \"kubernetes.io/projected/b46ba90c-a614-444e-a716-7cd74dd392fd-kube-api-access-pd6x5\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.514184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.514151 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-console-config\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.514184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.514161 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-oauth-config\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.514184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.514170 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b46ba90c-a614-444e-a716-7cd74dd392fd-service-ca\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:02.514184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:02.514178 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b46ba90c-a614-444e-a716-7cd74dd392fd-console-serving-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:03.234491 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.234463 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bcb5b4bcd-mb7jq_b46ba90c-a614-444e-a716-7cd74dd392fd/console/0.log"
Apr 24 21:31:03.234657 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.234505 2569 generic.go:358] "Generic (PLEG): container finished" podID="b46ba90c-a614-444e-a716-7cd74dd392fd" containerID="46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936" exitCode=2
Apr 24 21:31:03.234657 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.234567 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcb5b4bcd-mb7jq" event={"ID":"b46ba90c-a614-444e-a716-7cd74dd392fd","Type":"ContainerDied","Data":"46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936"}
Apr 24 21:31:03.234657 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.234572 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bcb5b4bcd-mb7jq"
Apr 24 21:31:03.234657 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.234601 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bcb5b4bcd-mb7jq" event={"ID":"b46ba90c-a614-444e-a716-7cd74dd392fd","Type":"ContainerDied","Data":"a54d429605577f441b725cd9e938b72aa95f2dd2c9465914113385f836880c68"}
Apr 24 21:31:03.234657 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.234620 2569 scope.go:117] "RemoveContainer" containerID="46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936"
Apr 24 21:31:03.236588 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.236566 2569 generic.go:358] "Generic (PLEG): container finished" podID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerID="4f8be89498f3dc268e5257497b2251bcd25364ecf5c24fdf43ccf749084bb207" exitCode=0
Apr 24 21:31:03.236762 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.236621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" event={"ID":"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1","Type":"ContainerDied","Data":"4f8be89498f3dc268e5257497b2251bcd25364ecf5c24fdf43ccf749084bb207"}
Apr 24 21:31:03.242520 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.242494 2569 scope.go:117] "RemoveContainer" containerID="46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936"
Apr 24 21:31:03.242810 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:03.242781 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936\": container with ID starting with 46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936 not found: ID does not exist" containerID="46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936"
Apr 24 21:31:03.242875 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.242823 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936"} err="failed to get container status \"46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936\": rpc error: code = NotFound desc = could not find container \"46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936\": container with ID starting with 46d1524344d7f3cd635d44dcc0ece9acac15b5cdb5cbd7762d3551fb21c77936 not found: ID does not exist"
Apr 24 21:31:03.275192 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.275152 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bcb5b4bcd-mb7jq"]
Apr 24 21:31:03.279210 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.279181 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bcb5b4bcd-mb7jq"]
Apr 24 21:31:03.332085 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:03.332053 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46ba90c-a614-444e-a716-7cd74dd392fd" path="/var/lib/kubelet/pods/b46ba90c-a614-444e-a716-7cd74dd392fd/volumes"
Apr 24 21:31:04.360221 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.360196 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c"
Apr 24 21:31:04.426129 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.426096 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp8sp\" (UniqueName: \"kubernetes.io/projected/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-kube-api-access-zp8sp\") pod \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") "
Apr 24 21:31:04.426129 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.426138 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-bundle\") pod \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") "
Apr 24 21:31:04.426333 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.426168 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-util\") pod \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\" (UID: \"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1\") "
Apr 24 21:31:04.426871 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.426845 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-bundle" (OuterVolumeSpecName: "bundle") pod "87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" (UID: "87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:31:04.428451 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.428425 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-kube-api-access-zp8sp" (OuterVolumeSpecName: "kube-api-access-zp8sp") pod "87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" (UID: "87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1"). InnerVolumeSpecName "kube-api-access-zp8sp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:31:04.430903 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.430882 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-util" (OuterVolumeSpecName: "util") pod "87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" (UID: "87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:31:04.527133 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.527095 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zp8sp\" (UniqueName: \"kubernetes.io/projected/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-kube-api-access-zp8sp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:04.527133 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.527126 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:04.527133 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:04.527137 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:31:05.244719 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:05.244660 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c" event={"ID":"87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1","Type":"ContainerDied","Data":"8269b1a2e4e4bfff0ae9cefc83e5890bb3782346bb108d3be18ecc9c1b3f7166"}
Apr 24 21:31:05.244719 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:05.244711 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl872c"
Apr 24 21:31:05.244719 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:05.244723 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8269b1a2e4e4bfff0ae9cefc83e5890bb3782346bb108d3be18ecc9c1b3f7166"
Apr 24 21:31:07.349925 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.349884 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"]
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350180 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerName="util"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350192 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerName="util"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350206 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerName="extract"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350211 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerName="extract"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350224 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b46ba90c-a614-444e-a716-7cd74dd392fd" containerName="console"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350233 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46ba90c-a614-444e-a716-7cd74dd392fd" containerName="console"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350247 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerName="pull"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350252 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerName="pull"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350294 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b46ba90c-a614-444e-a716-7cd74dd392fd" containerName="console"
Apr 24 21:31:07.350487 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.350302 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="87ae2d66-6fbf-4d59-a28f-25a89bc1e0a1" containerName="extract"
Apr 24 21:31:07.395992 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.395950 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"]
Apr 24 21:31:07.396146 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.396083 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:07.402654 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.402473 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 21:31:07.402654 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.402487 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-w9qk4\""
Apr 24 21:31:07.402821 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.402662 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 21:31:07.402821 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.402702 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 21:31:07.449311 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.449274 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4\" (UID: \"515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:07.449483 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.449355 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgcdt\" (UniqueName: \"kubernetes.io/projected/515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2-kube-api-access-bgcdt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4\" (UID: \"515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:07.549935 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.549896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4\" (UID: \"515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:07.550120 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.549968 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgcdt\" (UniqueName: \"kubernetes.io/projected/515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2-kube-api-access-bgcdt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4\" (UID: \"515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:07.552475 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.552451 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4\" (UID: \"515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:07.559597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.559574 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgcdt\" (UniqueName: \"kubernetes.io/projected/515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2-kube-api-access-bgcdt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4\" (UID: \"515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:07.706589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.706498 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:07.839883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:07.839846 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"]
Apr 24 21:31:07.843550 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:31:07.843509 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod515eb3fc_b1dd_4fdd_a1cb_7f2db8b01ec2.slice/crio-2ede24b00a2eb5d69e50d22b0d2c1627934db34edfd7e192f174671d6fa1d202 WatchSource:0}: Error finding container 2ede24b00a2eb5d69e50d22b0d2c1627934db34edfd7e192f174671d6fa1d202: Status 404 returned error can't find the container with id 2ede24b00a2eb5d69e50d22b0d2c1627934db34edfd7e192f174671d6fa1d202
Apr 24 21:31:08.253594 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:08.253560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4" event={"ID":"515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2","Type":"ContainerStarted","Data":"2ede24b00a2eb5d69e50d22b0d2c1627934db34edfd7e192f174671d6fa1d202"}
Apr 24 21:31:12.267408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.267366 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4" event={"ID":"515eb3fc-b1dd-4fdd-a1cb-7f2db8b01ec2","Type":"ContainerStarted","Data":"03bb46194adf503f5da1ad09da3976b8c56b91e26601d15435eb4e7abaccd9ea"}
Apr 24 21:31:12.267793 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.267484 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4"
Apr 24 21:31:12.315643 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.315586 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4" podStartSLOduration=1.5230606880000002 podStartE2EDuration="5.315573056s" podCreationTimestamp="2026-04-24 21:31:07 +0000 UTC" firstStartedPulling="2026-04-24 21:31:07.845277003 +0000 UTC m=+279.085630241" lastFinishedPulling="2026-04-24 21:31:11.637789367 +0000 UTC m=+282.878142609" observedRunningTime="2026-04-24 21:31:12.311370485 +0000 UTC m=+283.551723745" watchObservedRunningTime="2026-04-24 21:31:12.315573056 +0000 UTC m=+283.555926321"
Apr 24 21:31:12.339190 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.339152 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qlfjw"]
Apr 24 21:31:12.359894 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.359858 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.363949 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.363923 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-n2vl9\""
Apr 24 21:31:12.364938 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.364920 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 21:31:12.365494 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.365479 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 21:31:12.370620 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.370598 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qlfjw"]
Apr 24 21:31:12.389866 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.389829 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.389866 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.389865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqdmf\" (UniqueName: \"kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-kube-api-access-sqdmf\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.390099 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.389968 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/505d8183-2459-4908-830e-1b016cd5e4fc-cabundle0\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.491107 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.491061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/505d8183-2459-4908-830e-1b016cd5e4fc-cabundle0\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.491107 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.491106 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.491322 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.491153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqdmf\" (UniqueName: \"kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-kube-api-access-sqdmf\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.491322 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.491230 2569 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 24 21:31:12.491322 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.491253 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 24 21:31:12.491322 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.491260 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 21:31:12.491322 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.491274 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qlfjw: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 24 21:31:12.491519 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.491325 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates podName:505d8183-2459-4908-830e-1b016cd5e4fc nodeName:}" failed. No retries permitted until 2026-04-24 21:31:12.9913107 +0000 UTC m=+284.231663939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates") pod "keda-operator-ffbb595cb-qlfjw" (UID: "505d8183-2459-4908-830e-1b016cd5e4fc") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 24 21:31:12.491810 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.491792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/505d8183-2459-4908-830e-1b016cd5e4fc-cabundle0\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.500644 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.500608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqdmf\" (UniqueName: \"kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-kube-api-access-sqdmf\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.679094 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.679020 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"]
Apr 24 21:31:12.682264 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.682248 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.685119 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.685093 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 21:31:12.698911 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.698873 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"]
Apr 24 21:31:12.793852 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.793810 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.793852 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.793852 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqmnm\" (UniqueName: \"kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-kube-api-access-vqmnm\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.794107 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.793933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fb72e419-2ae2-4693-b7d5-77e666c8fde3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.895227 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.895191 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fb72e419-2ae2-4693-b7d5-77e666c8fde3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.895402 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.895247 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.895402 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.895268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqmnm\" (UniqueName: \"kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-kube-api-access-vqmnm\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.895520 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.895412 2569 secret.go:281] references non-existent secret key: tls.crt
Apr 24 21:31:12.895520 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.895435 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 21:31:12.895520 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.895460 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck: references non-existent secret key: tls.crt
Apr 24 21:31:12.895637 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.895550 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-certificates podName:fb72e419-2ae2-4693-b7d5-77e666c8fde3 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:13.395516472 +0000 UTC m=+284.635869726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-certificates") pod "keda-metrics-apiserver-7c9f485588-8s6ck" (UID: "fb72e419-2ae2-4693-b7d5-77e666c8fde3") : references non-existent secret key: tls.crt
Apr 24 21:31:12.895637 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.895572 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fb72e419-2ae2-4693-b7d5-77e666c8fde3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.946537 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.946464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqmnm\" (UniqueName: \"kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-kube-api-access-vqmnm\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:12.996538 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:12.996501 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:12.996733 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.996649 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 24 21:31:12.996733 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.996685 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 21:31:12.996733 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.996698 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qlfjw: references non-existent secret key: ca.crt
Apr 24 21:31:12.996886 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:31:12.996754 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates podName:505d8183-2459-4908-830e-1b016cd5e4fc nodeName:}" failed. No retries permitted until 2026-04-24 21:31:13.996739577 +0000 UTC m=+285.237092816 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates") pod "keda-operator-ffbb595cb-qlfjw" (UID: "505d8183-2459-4908-830e-1b016cd5e4fc") : references non-existent secret key: ca.crt
Apr 24 21:31:13.400009 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:13.399957 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:13.402615 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:13.402590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fb72e419-2ae2-4693-b7d5-77e666c8fde3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8s6ck\" (UID: \"fb72e419-2ae2-4693-b7d5-77e666c8fde3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:13.596088 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:13.596042 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"
Apr 24 21:31:13.729267 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:13.729243 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck"]
Apr 24 21:31:13.731289 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:31:13.731258 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb72e419_2ae2_4693_b7d5_77e666c8fde3.slice/crio-439686b4dd742bcc6bb11a1afaa8cccdb65a3bb7e987efde5c7fb865a1531a7a WatchSource:0}: Error finding container 439686b4dd742bcc6bb11a1afaa8cccdb65a3bb7e987efde5c7fb865a1531a7a: Status 404 returned error can't find the container with id 439686b4dd742bcc6bb11a1afaa8cccdb65a3bb7e987efde5c7fb865a1531a7a
Apr 24 21:31:14.006018 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:14.005986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:14.008563 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:14.008539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/505d8183-2459-4908-830e-1b016cd5e4fc-certificates\") pod \"keda-operator-ffbb595cb-qlfjw\" (UID: \"505d8183-2459-4908-830e-1b016cd5e4fc\") " pod="openshift-keda/keda-operator-ffbb595cb-qlfjw"
Apr 24 21:31:14.173047 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:14.172987 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qlfjw" Apr 24 21:31:14.275164 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:14.275081 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck" event={"ID":"fb72e419-2ae2-4693-b7d5-77e666c8fde3","Type":"ContainerStarted","Data":"439686b4dd742bcc6bb11a1afaa8cccdb65a3bb7e987efde5c7fb865a1531a7a"} Apr 24 21:31:14.304319 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:14.304195 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qlfjw"] Apr 24 21:31:14.306344 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:31:14.306317 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod505d8183_2459_4908_830e_1b016cd5e4fc.slice/crio-d73a23ff56f163b75c725ab8aa063f317a2560b4daa1dc43c91f69e006ff2ea5 WatchSource:0}: Error finding container d73a23ff56f163b75c725ab8aa063f317a2560b4daa1dc43c91f69e006ff2ea5: Status 404 returned error can't find the container with id d73a23ff56f163b75c725ab8aa063f317a2560b4daa1dc43c91f69e006ff2ea5 Apr 24 21:31:15.280571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:15.280528 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qlfjw" event={"ID":"505d8183-2459-4908-830e-1b016cd5e4fc","Type":"ContainerStarted","Data":"d73a23ff56f163b75c725ab8aa063f317a2560b4daa1dc43c91f69e006ff2ea5"} Apr 24 21:31:18.293269 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:18.293228 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qlfjw" event={"ID":"505d8183-2459-4908-830e-1b016cd5e4fc","Type":"ContainerStarted","Data":"8b69b88fa46031b8cefbfe6f05d55a0c8c7ed186bfbb5ebdd4a9ca4b05eac303"} Apr 24 21:31:18.293745 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:18.293314 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-keda/keda-operator-ffbb595cb-qlfjw" Apr 24 21:31:18.294555 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:18.294520 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck" event={"ID":"fb72e419-2ae2-4693-b7d5-77e666c8fde3","Type":"ContainerStarted","Data":"780f21f8434f0b4ac9789b013b9e274c8fc089b077c8651f9c336f6c4dc2ff5a"} Apr 24 21:31:18.294719 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:18.294702 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck" Apr 24 21:31:18.310317 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:18.310268 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-qlfjw" podStartSLOduration=3.082353931 podStartE2EDuration="6.310255583s" podCreationTimestamp="2026-04-24 21:31:12 +0000 UTC" firstStartedPulling="2026-04-24 21:31:14.307729088 +0000 UTC m=+285.548082327" lastFinishedPulling="2026-04-24 21:31:17.53563074 +0000 UTC m=+288.775983979" observedRunningTime="2026-04-24 21:31:18.309854898 +0000 UTC m=+289.550208168" watchObservedRunningTime="2026-04-24 21:31:18.310255583 +0000 UTC m=+289.550608844" Apr 24 21:31:18.327421 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:18.327363 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck" podStartSLOduration=2.5834244159999997 podStartE2EDuration="6.327350593s" podCreationTimestamp="2026-04-24 21:31:12 +0000 UTC" firstStartedPulling="2026-04-24 21:31:13.732602425 +0000 UTC m=+284.972955664" lastFinishedPulling="2026-04-24 21:31:17.476528599 +0000 UTC m=+288.716881841" observedRunningTime="2026-04-24 21:31:18.327098691 +0000 UTC m=+289.567451965" watchObservedRunningTime="2026-04-24 21:31:18.327350593 +0000 UTC m=+289.567703853" Apr 24 21:31:29.281598 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:31:29.281564 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:31:29.282034 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:29.281814 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:31:29.287316 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:29.287296 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:31:29.302597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:29.302576 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8s6ck" Apr 24 21:31:33.272756 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:33.272728 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dkfj4" Apr 24 21:31:39.300134 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:31:39.300107 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-qlfjw" Apr 24 21:32:06.232447 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.232408 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd"] Apr 24 21:32:06.241651 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.241633 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.244969 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.244949 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:32:06.245262 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.245247 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-w8zch\"" Apr 24 21:32:06.246358 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.246343 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:32:06.251592 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.251566 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd"] Apr 24 21:32:06.404848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.404807 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.405029 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.404872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vd7x\" (UniqueName: \"kubernetes.io/projected/4bf6a0ab-4efa-4ad0-896c-e279284930ff-kube-api-access-7vd7x\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" 
Apr 24 21:32:06.405029 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.404947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.505686 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.505644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vd7x\" (UniqueName: \"kubernetes.io/projected/4bf6a0ab-4efa-4ad0-896c-e279284930ff-kube-api-access-7vd7x\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.505848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.505697 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.505848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.505738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.506063 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:32:06.506046 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.506137 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.506086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.514983 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.514959 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vd7x\" (UniqueName: \"kubernetes.io/projected/4bf6a0ab-4efa-4ad0-896c-e279284930ff-kube-api-access-7vd7x\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.550655 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.550632 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:06.676738 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.676709 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd"] Apr 24 21:32:06.679236 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:32:06.679206 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bf6a0ab_4efa_4ad0_896c_e279284930ff.slice/crio-059fd0b4ed9d9707b44cb1d72c0bbf6580c03ac2a2c47b8bf6f7421bd892e4a0 WatchSource:0}: Error finding container 059fd0b4ed9d9707b44cb1d72c0bbf6580c03ac2a2c47b8bf6f7421bd892e4a0: Status 404 returned error can't find the container with id 059fd0b4ed9d9707b44cb1d72c0bbf6580c03ac2a2c47b8bf6f7421bd892e4a0 Apr 24 21:32:06.681149 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:06.681130 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:32:07.444557 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:07.444518 2569 generic.go:358] "Generic (PLEG): container finished" podID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerID="98cc51d9ae441a0c212929b96ead4c3caf20d439f91ec0d5846cf81974b301ed" exitCode=0 Apr 24 21:32:07.444557 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:07.444559 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" event={"ID":"4bf6a0ab-4efa-4ad0-896c-e279284930ff","Type":"ContainerDied","Data":"98cc51d9ae441a0c212929b96ead4c3caf20d439f91ec0d5846cf81974b301ed"} Apr 24 21:32:07.445023 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:07.444581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" 
event={"ID":"4bf6a0ab-4efa-4ad0-896c-e279284930ff","Type":"ContainerStarted","Data":"059fd0b4ed9d9707b44cb1d72c0bbf6580c03ac2a2c47b8bf6f7421bd892e4a0"} Apr 24 21:32:10.455329 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:10.455292 2569 generic.go:358] "Generic (PLEG): container finished" podID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerID="83bd811bb847d1f0a4c5f902ba7f2c13c726f35760b9fabb2e6abf6e7600e375" exitCode=0 Apr 24 21:32:10.455804 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:10.455377 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" event={"ID":"4bf6a0ab-4efa-4ad0-896c-e279284930ff","Type":"ContainerDied","Data":"83bd811bb847d1f0a4c5f902ba7f2c13c726f35760b9fabb2e6abf6e7600e375"} Apr 24 21:32:11.460366 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:11.460337 2569 generic.go:358] "Generic (PLEG): container finished" podID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerID="70081d829e8a7c4034bd9fa9369b6a21dc1d39b192c0d64475002b747403d5e2" exitCode=0 Apr 24 21:32:11.460736 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:11.460394 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" event={"ID":"4bf6a0ab-4efa-4ad0-896c-e279284930ff","Type":"ContainerDied","Data":"70081d829e8a7c4034bd9fa9369b6a21dc1d39b192c0d64475002b747403d5e2"} Apr 24 21:32:12.590130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.590102 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:12.753157 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.753132 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vd7x\" (UniqueName: \"kubernetes.io/projected/4bf6a0ab-4efa-4ad0-896c-e279284930ff-kube-api-access-7vd7x\") pod \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " Apr 24 21:32:12.753310 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.753171 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-util\") pod \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " Apr 24 21:32:12.753310 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.753211 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-bundle\") pod \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\" (UID: \"4bf6a0ab-4efa-4ad0-896c-e279284930ff\") " Apr 24 21:32:12.753936 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.753913 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-bundle" (OuterVolumeSpecName: "bundle") pod "4bf6a0ab-4efa-4ad0-896c-e279284930ff" (UID: "4bf6a0ab-4efa-4ad0-896c-e279284930ff"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:32:12.755418 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.755396 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bf6a0ab-4efa-4ad0-896c-e279284930ff-kube-api-access-7vd7x" (OuterVolumeSpecName: "kube-api-access-7vd7x") pod "4bf6a0ab-4efa-4ad0-896c-e279284930ff" (UID: "4bf6a0ab-4efa-4ad0-896c-e279284930ff"). InnerVolumeSpecName "kube-api-access-7vd7x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:32:12.759208 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.759180 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-util" (OuterVolumeSpecName: "util") pod "4bf6a0ab-4efa-4ad0-896c-e279284930ff" (UID: "4bf6a0ab-4efa-4ad0-896c-e279284930ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:32:12.853840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.853801 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vd7x\" (UniqueName: \"kubernetes.io/projected/4bf6a0ab-4efa-4ad0-896c-e279284930ff-kube-api-access-7vd7x\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:32:12.853840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.853832 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:32:12.853840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:12.853842 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bf6a0ab-4efa-4ad0-896c-e279284930ff-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:32:13.468252 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:13.468216 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" event={"ID":"4bf6a0ab-4efa-4ad0-896c-e279284930ff","Type":"ContainerDied","Data":"059fd0b4ed9d9707b44cb1d72c0bbf6580c03ac2a2c47b8bf6f7421bd892e4a0"} Apr 24 21:32:13.468252 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:13.468251 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="059fd0b4ed9d9707b44cb1d72c0bbf6580c03ac2a2c47b8bf6f7421bd892e4a0" Apr 24 21:32:13.468441 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:13.468275 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dt7whd" Apr 24 21:32:19.368573 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.368491 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj"] Apr 24 21:32:19.368948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.368813 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerName="extract" Apr 24 21:32:19.368948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.368827 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerName="extract" Apr 24 21:32:19.368948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.368845 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerName="util" Apr 24 21:32:19.368948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.368850 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerName="util" Apr 24 21:32:19.368948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.368858 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerName="pull" Apr 24 21:32:19.368948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.368864 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerName="pull" Apr 24 21:32:19.368948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.368904 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bf6a0ab-4efa-4ad0-896c-e279284930ff" containerName="extract" Apr 24 21:32:19.371596 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.371579 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" Apr 24 21:32:19.376686 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.376643 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 24 21:32:19.376813 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.376714 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:32:19.376880 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.376823 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-wjfrd\"" Apr 24 21:32:19.390155 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.390128 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj"] Apr 24 21:32:19.398823 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.398799 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a35d7bf-7059-483b-a245-457980a6e456-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-zwmzj\" (UID: 
\"5a35d7bf-7059-483b-a245-457980a6e456\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" Apr 24 21:32:19.398940 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.398866 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzkfn\" (UniqueName: \"kubernetes.io/projected/5a35d7bf-7059-483b-a245-457980a6e456-kube-api-access-mzkfn\") pod \"cert-manager-operator-controller-manager-54b9655956-zwmzj\" (UID: \"5a35d7bf-7059-483b-a245-457980a6e456\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" Apr 24 21:32:19.499464 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.499431 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a35d7bf-7059-483b-a245-457980a6e456-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-zwmzj\" (UID: \"5a35d7bf-7059-483b-a245-457980a6e456\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" Apr 24 21:32:19.499628 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.499493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzkfn\" (UniqueName: \"kubernetes.io/projected/5a35d7bf-7059-483b-a245-457980a6e456-kube-api-access-mzkfn\") pod \"cert-manager-operator-controller-manager-54b9655956-zwmzj\" (UID: \"5a35d7bf-7059-483b-a245-457980a6e456\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" Apr 24 21:32:19.499921 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.499902 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a35d7bf-7059-483b-a245-457980a6e456-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-zwmzj\" (UID: \"5a35d7bf-7059-483b-a245-457980a6e456\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" Apr 24 21:32:19.517447 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.517415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzkfn\" (UniqueName: \"kubernetes.io/projected/5a35d7bf-7059-483b-a245-457980a6e456-kube-api-access-mzkfn\") pod \"cert-manager-operator-controller-manager-54b9655956-zwmzj\" (UID: \"5a35d7bf-7059-483b-a245-457980a6e456\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" Apr 24 21:32:19.680507 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.680414 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" Apr 24 21:32:19.822452 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:32:19.822420 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a35d7bf_7059_483b_a245_457980a6e456.slice/crio-beaf78cd518cedfa61345e1ff3da2c8c82f2f298aa22de158edc4657ae5e4da8 WatchSource:0}: Error finding container beaf78cd518cedfa61345e1ff3da2c8c82f2f298aa22de158edc4657ae5e4da8: Status 404 returned error can't find the container with id beaf78cd518cedfa61345e1ff3da2c8c82f2f298aa22de158edc4657ae5e4da8 Apr 24 21:32:19.824849 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:19.824826 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj"] Apr 24 21:32:20.492687 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:20.492631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" event={"ID":"5a35d7bf-7059-483b-a245-457980a6e456","Type":"ContainerStarted","Data":"beaf78cd518cedfa61345e1ff3da2c8c82f2f298aa22de158edc4657ae5e4da8"} Apr 24 21:32:21.498339 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:21.498295 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" event={"ID":"5a35d7bf-7059-483b-a245-457980a6e456","Type":"ContainerStarted","Data":"4f51bcb473989f5391bedbf7aaec661964683912217ad0d8caef98bcdc85b849"}
Apr 24 21:32:21.529188 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:21.529133 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-zwmzj" podStartSLOduration=0.951775382 podStartE2EDuration="2.529119376s" podCreationTimestamp="2026-04-24 21:32:19 +0000 UTC" firstStartedPulling="2026-04-24 21:32:19.824928817 +0000 UTC m=+351.065282056" lastFinishedPulling="2026-04-24 21:32:21.402272798 +0000 UTC m=+352.642626050" observedRunningTime="2026-04-24 21:32:21.527658147 +0000 UTC m=+352.768011409" watchObservedRunningTime="2026-04-24 21:32:21.529119376 +0000 UTC m=+352.769472637"
Apr 24 21:32:28.574120 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.574081 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"]
Apr 24 21:32:28.577590 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.577569 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.582357 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.582333 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:32:28.583647 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.583619 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-w8zch\""
Apr 24 21:32:28.583772 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.583646 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:32:28.597112 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.597073 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"]
Apr 24 21:32:28.670785 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.670745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf267\" (UniqueName: \"kubernetes.io/projected/ee871743-a575-441c-bd3c-980407b20966-kube-api-access-kf267\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.670943 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.670862 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.670943 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.670916 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.771422 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.771382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.771422 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.771425 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf267\" (UniqueName: \"kubernetes.io/projected/ee871743-a575-441c-bd3c-980407b20966-kube-api-access-kf267\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.771614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.771480 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.771777 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.771756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.771819 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.771787 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.788250 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.788226 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf267\" (UniqueName: \"kubernetes.io/projected/ee871743-a575-441c-bd3c-980407b20966-kube-api-access-kf267\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:28.886532 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:28.886449 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:29.026554 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:29.026522 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"]
Apr 24 21:32:29.029807 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:32:29.029779 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee871743_a575_441c_bd3c_980407b20966.slice/crio-3ade53024d3f120ae0a4e5a1420cc1e494c157f26f9f6cc865eda1d6e7694e01 WatchSource:0}: Error finding container 3ade53024d3f120ae0a4e5a1420cc1e494c157f26f9f6cc865eda1d6e7694e01: Status 404 returned error can't find the container with id 3ade53024d3f120ae0a4e5a1420cc1e494c157f26f9f6cc865eda1d6e7694e01
Apr 24 21:32:29.526398 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:29.526359 2569 generic.go:358] "Generic (PLEG): container finished" podID="ee871743-a575-441c-bd3c-980407b20966" containerID="1a51805d95930302d5a6dd9f3dc1de3067efcd53e86a2a03800366c2ab667c95" exitCode=0
Apr 24 21:32:29.526398 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:29.526399 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h" event={"ID":"ee871743-a575-441c-bd3c-980407b20966","Type":"ContainerDied","Data":"1a51805d95930302d5a6dd9f3dc1de3067efcd53e86a2a03800366c2ab667c95"}
Apr 24 21:32:29.526662 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:29.526424 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h" event={"ID":"ee871743-a575-441c-bd3c-980407b20966","Type":"ContainerStarted","Data":"3ade53024d3f120ae0a4e5a1420cc1e494c157f26f9f6cc865eda1d6e7694e01"}
Apr 24 21:32:32.538795 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:32.538757 2569 generic.go:358] "Generic (PLEG): container finished" podID="ee871743-a575-441c-bd3c-980407b20966" containerID="9fd328619bd4a67c01744d25c28876d33d7a22507ce2e893068e5404928f39aa" exitCode=0
Apr 24 21:32:32.539170 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:32.538804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h" event={"ID":"ee871743-a575-441c-bd3c-980407b20966","Type":"ContainerDied","Data":"9fd328619bd4a67c01744d25c28876d33d7a22507ce2e893068e5404928f39aa"}
Apr 24 21:32:33.543446 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:33.543410 2569 generic.go:358] "Generic (PLEG): container finished" podID="ee871743-a575-441c-bd3c-980407b20966" containerID="2de9e4754b64d9b3a0189c5c7679d0d6c082defdde725d873d9607a5d2424a23" exitCode=0
Apr 24 21:32:33.543855 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:33.543499 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h" event={"ID":"ee871743-a575-441c-bd3c-980407b20966","Type":"ContainerDied","Data":"2de9e4754b64d9b3a0189c5c7679d0d6c082defdde725d873d9607a5d2424a23"}
Apr 24 21:32:34.666330 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.666308 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:34.717832 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.717799 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-bundle\") pod \"ee871743-a575-441c-bd3c-980407b20966\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") "
Apr 24 21:32:34.717983 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.717864 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf267\" (UniqueName: \"kubernetes.io/projected/ee871743-a575-441c-bd3c-980407b20966-kube-api-access-kf267\") pod \"ee871743-a575-441c-bd3c-980407b20966\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") "
Apr 24 21:32:34.717983 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.717971 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-util\") pod \"ee871743-a575-441c-bd3c-980407b20966\" (UID: \"ee871743-a575-441c-bd3c-980407b20966\") "
Apr 24 21:32:34.718270 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.718230 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-bundle" (OuterVolumeSpecName: "bundle") pod "ee871743-a575-441c-bd3c-980407b20966" (UID: "ee871743-a575-441c-bd3c-980407b20966"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:34.720122 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.720098 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee871743-a575-441c-bd3c-980407b20966-kube-api-access-kf267" (OuterVolumeSpecName: "kube-api-access-kf267") pod "ee871743-a575-441c-bd3c-980407b20966" (UID: "ee871743-a575-441c-bd3c-980407b20966"). InnerVolumeSpecName "kube-api-access-kf267". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:34.723824 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.723794 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-util" (OuterVolumeSpecName: "util") pod "ee871743-a575-441c-bd3c-980407b20966" (UID: "ee871743-a575-441c-bd3c-980407b20966"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:34.818495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.818417 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kf267\" (UniqueName: \"kubernetes.io/projected/ee871743-a575-441c-bd3c-980407b20966-kube-api-access-kf267\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:32:34.818495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.818444 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:32:34.818495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:34.818454 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee871743-a575-441c-bd3c-980407b20966-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:32:35.551345 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:35.551309 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h" event={"ID":"ee871743-a575-441c-bd3c-980407b20966","Type":"ContainerDied","Data":"3ade53024d3f120ae0a4e5a1420cc1e494c157f26f9f6cc865eda1d6e7694e01"}
Apr 24 21:32:35.551345 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:35.551347 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ade53024d3f120ae0a4e5a1420cc1e494c157f26f9f6cc865eda1d6e7694e01"
Apr 24 21:32:35.551551 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:35.551319 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc486h"
Apr 24 21:32:54.600440 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.600399 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"]
Apr 24 21:32:54.600937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.600716 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee871743-a575-441c-bd3c-980407b20966" containerName="util"
Apr 24 21:32:54.600937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.600730 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee871743-a575-441c-bd3c-980407b20966" containerName="util"
Apr 24 21:32:54.600937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.600737 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee871743-a575-441c-bd3c-980407b20966" containerName="pull"
Apr 24 21:32:54.600937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.600745 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee871743-a575-441c-bd3c-980407b20966" containerName="pull"
Apr 24 21:32:54.600937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.600761 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee871743-a575-441c-bd3c-980407b20966" containerName="extract"
Apr 24 21:32:54.600937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.600767 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee871743-a575-441c-bd3c-980407b20966" containerName="extract"
Apr 24 21:32:54.600937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.600823 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee871743-a575-441c-bd3c-980407b20966" containerName="extract"
Apr 24 21:32:54.603729 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.603711 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.607777 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.607750 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:32:54.609170 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.609144 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-w8zch\""
Apr 24 21:32:54.609294 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.609230 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:32:54.627894 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.627853 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"]
Apr 24 21:32:54.678914 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.678870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.679142 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.678933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kfzc\" (UniqueName: \"kubernetes.io/projected/94a95721-09a5-4ce0-a983-c80aba8d6412-kube-api-access-6kfzc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.679142 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.678986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.780107 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.780060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.780277 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.780123 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kfzc\" (UniqueName: \"kubernetes.io/projected/94a95721-09a5-4ce0-a983-c80aba8d6412-kube-api-access-6kfzc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.780277 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.780164 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.780560 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.780540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.780597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.780549 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.790850 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.790818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kfzc\" (UniqueName: \"kubernetes.io/projected/94a95721-09a5-4ce0-a983-c80aba8d6412-kube-api-access-6kfzc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:54.913478 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:54.913389 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:55.046850 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:55.046816 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"]
Apr 24 21:32:55.048268 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:32:55.048237 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a95721_09a5_4ce0_a983_c80aba8d6412.slice/crio-32749802d947e7200f26f86c7851f41d4af8c81d28a793677076440b7565a09d WatchSource:0}: Error finding container 32749802d947e7200f26f86c7851f41d4af8c81d28a793677076440b7565a09d: Status 404 returned error can't find the container with id 32749802d947e7200f26f86c7851f41d4af8c81d28a793677076440b7565a09d
Apr 24 21:32:55.618704 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:55.618653 2569 generic.go:358] "Generic (PLEG): container finished" podID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerID="424e8e01855f3b40c02e8011503b79a010dedc0d043c5ab042e28c33f016dd0a" exitCode=0
Apr 24 21:32:55.619105 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:55.618743 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl" event={"ID":"94a95721-09a5-4ce0-a983-c80aba8d6412","Type":"ContainerDied","Data":"424e8e01855f3b40c02e8011503b79a010dedc0d043c5ab042e28c33f016dd0a"}
Apr 24 21:32:55.619105 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:55.618777 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl" event={"ID":"94a95721-09a5-4ce0-a983-c80aba8d6412","Type":"ContainerStarted","Data":"32749802d947e7200f26f86c7851f41d4af8c81d28a793677076440b7565a09d"}
Apr 24 21:32:56.625746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:56.625628 2569 generic.go:358] "Generic (PLEG): container finished" podID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerID="eff071de1ab4fbdb87bf8540b4c5389396c41ffbb23f4b45c3cd32a153d8d9d8" exitCode=0
Apr 24 21:32:56.625746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:56.625699 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl" event={"ID":"94a95721-09a5-4ce0-a983-c80aba8d6412","Type":"ContainerDied","Data":"eff071de1ab4fbdb87bf8540b4c5389396c41ffbb23f4b45c3cd32a153d8d9d8"}
Apr 24 21:32:57.630712 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:57.630661 2569 generic.go:358] "Generic (PLEG): container finished" podID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerID="55c749d5a817768816b39a5453398aad816836e01e83d28ac2d3b5f92d636c55" exitCode=0
Apr 24 21:32:57.631059 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:57.630713 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl" event={"ID":"94a95721-09a5-4ce0-a983-c80aba8d6412","Type":"ContainerDied","Data":"55c749d5a817768816b39a5453398aad816836e01e83d28ac2d3b5f92d636c55"}
Apr 24 21:32:58.754427 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:58.754404 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:58.916708 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:58.916599 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-util\") pod \"94a95721-09a5-4ce0-a983-c80aba8d6412\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") "
Apr 24 21:32:58.916708 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:58.916647 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-bundle\") pod \"94a95721-09a5-4ce0-a983-c80aba8d6412\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") "
Apr 24 21:32:58.916708 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:58.916697 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kfzc\" (UniqueName: \"kubernetes.io/projected/94a95721-09a5-4ce0-a983-c80aba8d6412-kube-api-access-6kfzc\") pod \"94a95721-09a5-4ce0-a983-c80aba8d6412\" (UID: \"94a95721-09a5-4ce0-a983-c80aba8d6412\") "
Apr 24 21:32:58.917553 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:58.917520 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-bundle" (OuterVolumeSpecName: "bundle") pod "94a95721-09a5-4ce0-a983-c80aba8d6412" (UID: "94a95721-09a5-4ce0-a983-c80aba8d6412"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:58.918966 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:58.918933 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a95721-09a5-4ce0-a983-c80aba8d6412-kube-api-access-6kfzc" (OuterVolumeSpecName: "kube-api-access-6kfzc") pod "94a95721-09a5-4ce0-a983-c80aba8d6412" (UID: "94a95721-09a5-4ce0-a983-c80aba8d6412"). InnerVolumeSpecName "kube-api-access-6kfzc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:32:58.922387 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:58.922368 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-util" (OuterVolumeSpecName: "util") pod "94a95721-09a5-4ce0-a983-c80aba8d6412" (UID: "94a95721-09a5-4ce0-a983-c80aba8d6412"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:32:59.017407 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:59.017351 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:32:59.017407 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:59.017399 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94a95721-09a5-4ce0-a983-c80aba8d6412-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:32:59.017407 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:59.017412 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6kfzc\" (UniqueName: \"kubernetes.io/projected/94a95721-09a5-4ce0-a983-c80aba8d6412-kube-api-access-6kfzc\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:32:59.638972 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:59.638892 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl" event={"ID":"94a95721-09a5-4ce0-a983-c80aba8d6412","Type":"ContainerDied","Data":"32749802d947e7200f26f86c7851f41d4af8c81d28a793677076440b7565a09d"}
Apr 24 21:32:59.638972 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:59.638917 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835wkxhl"
Apr 24 21:32:59.639148 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:32:59.638924 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32749802d947e7200f26f86c7851f41d4af8c81d28a793677076440b7565a09d"
Apr 24 21:33:09.331893 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.331861 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"]
Apr 24 21:33:09.332298 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.332116 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerName="extract"
Apr 24 21:33:09.332298 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.332126 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerName="extract"
Apr 24 21:33:09.332298 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.332139 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerName="util"
Apr 24 21:33:09.332298 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.332145 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerName="util"
Apr 24 21:33:09.332298 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.332163 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerName="pull"
Apr 24 21:33:09.332298 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.332168 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerName="pull"
Apr 24 21:33:09.332298 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.332211 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="94a95721-09a5-4ce0-a983-c80aba8d6412" containerName="extract"
Apr 24 21:33:09.336264 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.336243 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.362366 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.362331 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:33:09.362529 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.362435 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-w8zch\""
Apr 24 21:33:09.366246 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.366212 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:33:09.373655 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.370747 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"]
Apr 24 21:33:09.500010 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.499976 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.500010 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.500019 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.500272 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.500065 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gb4t\" (UniqueName: \"kubernetes.io/projected/ea8a456c-c623-4b39-8a84-ab061206214e-kube-api-access-4gb4t\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.601283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.601203 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gb4t\" (UniqueName: \"kubernetes.io/projected/ea8a456c-c623-4b39-8a84-ab061206214e-kube-api-access-4gb4t\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.601283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.601268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.601431 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.601298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.601772 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.601751 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.601814 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.601768 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.635391 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.635356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gb4t\" (UniqueName: \"kubernetes.io/projected/ea8a456c-c623-4b39-8a84-ab061206214e-kube-api-access-4gb4t\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.645138 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.645085 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"
Apr 24 21:33:09.833641 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:09.833616 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms"]
Apr 24 21:33:09.835932 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:33:09.835899 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea8a456c_c623_4b39_8a84_ab061206214e.slice/crio-c7c3defe3f3971408a236052bfeddc28d674ed90c4cc984bdc5677c6f2751ae4 WatchSource:0}: Error finding container c7c3defe3f3971408a236052bfeddc28d674ed90c4cc984bdc5677c6f2751ae4: Status 404 returned error can't find the container with id c7c3defe3f3971408a236052bfeddc28d674ed90c4cc984bdc5677c6f2751ae4
Apr 24 21:33:10.674143 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:10.674112 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea8a456c-c623-4b39-8a84-ab061206214e" containerID="cd1255faf970aaf66360021e7501706a7673f34fdf78fafac55f35c48bbd2422" exitCode=0
Apr 24 21:33:10.674509 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:10.674186 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms" event={"ID":"ea8a456c-c623-4b39-8a84-ab061206214e","Type":"ContainerDied","Data":"cd1255faf970aaf66360021e7501706a7673f34fdf78fafac55f35c48bbd2422"}
Apr 24 21:33:10.674509 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:10.674211 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms" event={"ID":"ea8a456c-c623-4b39-8a84-ab061206214e","Type":"ContainerStarted","Data":"c7c3defe3f3971408a236052bfeddc28d674ed90c4cc984bdc5677c6f2751ae4"}
Apr 24 21:33:11.013867 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.013827 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm"]
Apr 24 21:33:11.017193 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.017174 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm"
Apr 24 21:33:11.021463 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.021439 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 24 21:33:11.021595 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.021446 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-smn79\""
Apr 24 21:33:11.021595 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.021483 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 24 21:33:11.032108 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.032082 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm"]
Apr 24 21:33:11.114113 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.114077 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjz76\" (UniqueName: \"kubernetes.io/projected/a1b13e4c-60bb-4cc7-9861-5317e6a79351-kube-api-access-xjz76\") pod \"servicemesh-operator3-55f49c5f94-hwtmm\" (UID: \"a1b13e4c-60bb-4cc7-9861-5317e6a79351\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm"
Apr 24 21:33:11.114288 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.114136 2569
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a1b13e4c-60bb-4cc7-9861-5317e6a79351-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hwtmm\" (UID: \"a1b13e4c-60bb-4cc7-9861-5317e6a79351\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" Apr 24 21:33:11.214564 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.214534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a1b13e4c-60bb-4cc7-9861-5317e6a79351-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hwtmm\" (UID: \"a1b13e4c-60bb-4cc7-9861-5317e6a79351\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" Apr 24 21:33:11.214749 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.214631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjz76\" (UniqueName: \"kubernetes.io/projected/a1b13e4c-60bb-4cc7-9861-5317e6a79351-kube-api-access-xjz76\") pod \"servicemesh-operator3-55f49c5f94-hwtmm\" (UID: \"a1b13e4c-60bb-4cc7-9861-5317e6a79351\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" Apr 24 21:33:11.217177 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.217147 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a1b13e4c-60bb-4cc7-9861-5317e6a79351-operator-config\") pod \"servicemesh-operator3-55f49c5f94-hwtmm\" (UID: \"a1b13e4c-60bb-4cc7-9861-5317e6a79351\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" Apr 24 21:33:11.232471 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.232444 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjz76\" (UniqueName: \"kubernetes.io/projected/a1b13e4c-60bb-4cc7-9861-5317e6a79351-kube-api-access-xjz76\") pod 
\"servicemesh-operator3-55f49c5f94-hwtmm\" (UID: \"a1b13e4c-60bb-4cc7-9861-5317e6a79351\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" Apr 24 21:33:11.326254 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.326173 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" Apr 24 21:33:11.497552 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.497524 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm"] Apr 24 21:33:11.498867 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:33:11.498836 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b13e4c_60bb_4cc7_9861_5317e6a79351.slice/crio-971599996a571f87d2e7da5bf1907ff82f15aaef773309e113aceead22f7edc7 WatchSource:0}: Error finding container 971599996a571f87d2e7da5bf1907ff82f15aaef773309e113aceead22f7edc7: Status 404 returned error can't find the container with id 971599996a571f87d2e7da5bf1907ff82f15aaef773309e113aceead22f7edc7 Apr 24 21:33:11.682156 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.682065 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea8a456c-c623-4b39-8a84-ab061206214e" containerID="0bbe9575c7e9ab5aa41337c937ee78e2cce103bcf02377593fa8d347cdfa69a2" exitCode=0 Apr 24 21:33:11.682531 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.682165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms" event={"ID":"ea8a456c-c623-4b39-8a84-ab061206214e","Type":"ContainerDied","Data":"0bbe9575c7e9ab5aa41337c937ee78e2cce103bcf02377593fa8d347cdfa69a2"} Apr 24 21:33:11.683520 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:11.683497 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" 
event={"ID":"a1b13e4c-60bb-4cc7-9861-5317e6a79351","Type":"ContainerStarted","Data":"971599996a571f87d2e7da5bf1907ff82f15aaef773309e113aceead22f7edc7"} Apr 24 21:33:12.689335 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:12.689298 2569 generic.go:358] "Generic (PLEG): container finished" podID="ea8a456c-c623-4b39-8a84-ab061206214e" containerID="0f501353697e9b91e4a27aa484e3b29b4401c1748fff362f7a4a0fb990cedbad" exitCode=0 Apr 24 21:33:12.689774 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:12.689386 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms" event={"ID":"ea8a456c-c623-4b39-8a84-ab061206214e","Type":"ContainerDied","Data":"0f501353697e9b91e4a27aa484e3b29b4401c1748fff362f7a4a0fb990cedbad"} Apr 24 21:33:13.826522 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:13.826499 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms" Apr 24 21:33:13.937351 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:13.937303 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-bundle\") pod \"ea8a456c-c623-4b39-8a84-ab061206214e\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " Apr 24 21:33:13.937545 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:13.937361 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gb4t\" (UniqueName: \"kubernetes.io/projected/ea8a456c-c623-4b39-8a84-ab061206214e-kube-api-access-4gb4t\") pod \"ea8a456c-c623-4b39-8a84-ab061206214e\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " Apr 24 21:33:13.937545 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:13.937474 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-util\") pod \"ea8a456c-c623-4b39-8a84-ab061206214e\" (UID: \"ea8a456c-c623-4b39-8a84-ab061206214e\") " Apr 24 21:33:13.939121 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:13.939078 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-bundle" (OuterVolumeSpecName: "bundle") pod "ea8a456c-c623-4b39-8a84-ab061206214e" (UID: "ea8a456c-c623-4b39-8a84-ab061206214e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:13.941510 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:13.941407 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8a456c-c623-4b39-8a84-ab061206214e-kube-api-access-4gb4t" (OuterVolumeSpecName: "kube-api-access-4gb4t") pod "ea8a456c-c623-4b39-8a84-ab061206214e" (UID: "ea8a456c-c623-4b39-8a84-ab061206214e"). InnerVolumeSpecName "kube-api-access-4gb4t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:13.947429 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:13.947387 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-util" (OuterVolumeSpecName: "util") pod "ea8a456c-c623-4b39-8a84-ab061206214e" (UID: "ea8a456c-c623-4b39-8a84-ab061206214e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:14.039114 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.039082 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:14.039114 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.039116 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ea8a456c-c623-4b39-8a84-ab061206214e-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:14.039362 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.039134 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4gb4t\" (UniqueName: \"kubernetes.io/projected/ea8a456c-c623-4b39-8a84-ab061206214e-kube-api-access-4gb4t\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:14.696657 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.696621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" event={"ID":"a1b13e4c-60bb-4cc7-9861-5317e6a79351","Type":"ContainerStarted","Data":"501cf9d882990a8856a1b6c8104b44e4f8d81f5cc90e067311353aac48301570"} Apr 24 21:33:14.696847 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.696757 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" Apr 24 21:33:14.698163 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.698139 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms" event={"ID":"ea8a456c-c623-4b39-8a84-ab061206214e","Type":"ContainerDied","Data":"c7c3defe3f3971408a236052bfeddc28d674ed90c4cc984bdc5677c6f2751ae4"} Apr 24 21:33:14.698265 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.698176 
2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7c3defe3f3971408a236052bfeddc28d674ed90c4cc984bdc5677c6f2751ae4" Apr 24 21:33:14.698265 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.698148 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebxpfms" Apr 24 21:33:14.744515 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:14.744458 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" podStartSLOduration=2.474645171 podStartE2EDuration="4.744439256s" podCreationTimestamp="2026-04-24 21:33:10 +0000 UTC" firstStartedPulling="2026-04-24 21:33:11.501471304 +0000 UTC m=+402.741824544" lastFinishedPulling="2026-04-24 21:33:13.771265376 +0000 UTC m=+405.011618629" observedRunningTime="2026-04-24 21:33:14.743652736 +0000 UTC m=+405.984005994" watchObservedRunningTime="2026-04-24 21:33:14.744439256 +0000 UTC m=+405.984792518" Apr 24 21:33:25.704418 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:25.704385 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-hwtmm" Apr 24 21:33:36.727123 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.727087 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s"] Apr 24 21:33:36.727565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.727379 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea8a456c-c623-4b39-8a84-ab061206214e" containerName="pull" Apr 24 21:33:36.727565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.727390 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8a456c-c623-4b39-8a84-ab061206214e" containerName="pull" Apr 24 21:33:36.727565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.727456 
2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea8a456c-c623-4b39-8a84-ab061206214e" containerName="util" Apr 24 21:33:36.727565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.727463 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8a456c-c623-4b39-8a84-ab061206214e" containerName="util" Apr 24 21:33:36.727565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.727475 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea8a456c-c623-4b39-8a84-ab061206214e" containerName="extract" Apr 24 21:33:36.727565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.727483 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8a456c-c623-4b39-8a84-ab061206214e" containerName="extract" Apr 24 21:33:36.727565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.727559 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea8a456c-c623-4b39-8a84-ab061206214e" containerName="extract" Apr 24 21:33:36.731177 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.731154 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.734319 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.734292 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 24 21:33:36.734455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.734295 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 24 21:33:36.734455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.734380 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 24 21:33:36.734455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.734316 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:33:36.734455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.734421 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:33:36.734455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.734440 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 24 21:33:36.734804 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.734786 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-v52jc\"" Apr 24 21:33:36.747755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.747727 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s"] Apr 24 21:33:36.820296 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.820246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l55r6\" (UniqueName: 
\"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-kube-api-access-l55r6\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.820488 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.820327 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.820488 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.820374 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.820488 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.820404 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.820488 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.820460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: 
\"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.820719 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.820493 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.820719 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.820528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.921107 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.921059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l55r6\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-kube-api-access-l55r6\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.921283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.921118 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.921283 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:33:36.921144 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.921283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.921164 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.921283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.921195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.921283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.921215 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.921283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.921242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-dns-cert\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.921905 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.921879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.923970 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.923943 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.924362 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.924330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.924446 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.924399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.924537 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.924514 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.930387 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.930366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:36.930645 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:36.930620 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l55r6\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-kube-api-access-l55r6\") pod \"istiod-openshift-gateway-7cd77c7ffd-w7n5s\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:37.040688 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:37.040642 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:33:37.179653 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:37.179626 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s"] Apr 24 21:33:37.181740 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:33:37.181706 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0e3cd2_74a2_4b90_97e5_e86d9e48ecdb.slice/crio-957f40773d7ab97071c90d652b8c90b1095709c978bd66c3da1999608e486310 WatchSource:0}: Error finding container 957f40773d7ab97071c90d652b8c90b1095709c978bd66c3da1999608e486310: Status 404 returned error can't find the container with id 957f40773d7ab97071c90d652b8c90b1095709c978bd66c3da1999608e486310 Apr 24 21:33:37.775271 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:37.775236 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" event={"ID":"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb","Type":"ContainerStarted","Data":"957f40773d7ab97071c90d652b8c90b1095709c978bd66c3da1999608e486310"} Apr 24 21:33:39.622700 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:39.622638 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 24 21:33:39.623024 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:39.622747 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 24 21:33:39.784733 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:39.784690 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" 
event={"ID":"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb","Type":"ContainerStarted","Data":"2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a"}
Apr 24 21:33:39.784904 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:39.784805 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s"
Apr 24 21:33:39.806611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:39.806552 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" podStartSLOduration=1.367798324 podStartE2EDuration="3.806537303s" podCreationTimestamp="2026-04-24 21:33:36 +0000 UTC" firstStartedPulling="2026-04-24 21:33:37.183642547 +0000 UTC m=+428.423995800" lastFinishedPulling="2026-04-24 21:33:39.622381539 +0000 UTC m=+430.862734779" observedRunningTime="2026-04-24 21:33:39.803697248 +0000 UTC m=+431.044050510" watchObservedRunningTime="2026-04-24 21:33:39.806537303 +0000 UTC m=+431.046890564"
Apr 24 21:33:40.790352 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:40.790324 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s"
Apr 24 21:33:42.749585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.749548 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"]
Apr 24 21:33:42.753066 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.753048 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.756544 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.756517 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-lntx2\""
Apr 24 21:33:42.763758 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.763732 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"]
Apr 24 21:33:42.872950 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.872914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.872950 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.872954 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.873230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.872980 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.873230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.873089 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.873230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.873161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.873230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.873189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.873230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.873219 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.873425 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.873245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.873425 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.873296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c54fs\" (UniqueName: \"kubernetes.io/projected/aa08ecbd-1f58-4d2c-a231-3db5090bb227-kube-api-access-c54fs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974243 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974243 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974245 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974492 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c54fs\" (UniqueName: \"kubernetes.io/projected/aa08ecbd-1f58-4d2c-a231-3db5090bb227-kube-api-access-c54fs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974492 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974492 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974488 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974792 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974792 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974716 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.974898 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.975023 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.974997 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.975091 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.975051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.975302 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.975279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.975373 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.975285 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.977210 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.977187 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.977361 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.977342 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.982773 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.982748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aa08ecbd-1f58-4d2c-a231-3db5090bb227-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:42.982901 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:42.982885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c54fs\" (UniqueName: \"kubernetes.io/projected/aa08ecbd-1f58-4d2c-a231-3db5090bb227-kube-api-access-c54fs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5wn4v\" (UID: \"aa08ecbd-1f58-4d2c-a231-3db5090bb227\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:43.065820 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:43.065777 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:43.197459 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:43.197431 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"]
Apr 24 21:33:43.200523 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:33:43.200488 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa08ecbd_1f58_4d2c_a231_3db5090bb227.slice/crio-aa1f4019468195791ba89740ec26432abbfcf861c7b080a09c9aabe91c159db6 WatchSource:0}: Error finding container aa1f4019468195791ba89740ec26432abbfcf861c7b080a09c9aabe91c159db6: Status 404 returned error can't find the container with id aa1f4019468195791ba89740ec26432abbfcf861c7b080a09c9aabe91c159db6
Apr 24 21:33:43.800533 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:43.800496 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v" event={"ID":"aa08ecbd-1f58-4d2c-a231-3db5090bb227","Type":"ContainerStarted","Data":"aa1f4019468195791ba89740ec26432abbfcf861c7b080a09c9aabe91c159db6"}
Apr 24 21:33:45.752787 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:45.752748 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 24 21:33:45.753194 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:45.752828 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 24 21:33:45.753194 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:45.752871 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 24 21:33:46.818190 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:46.818150 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v" event={"ID":"aa08ecbd-1f58-4d2c-a231-3db5090bb227","Type":"ContainerStarted","Data":"b28bc073153b63a208d4f28153b48ab20c4e35ce4604d92622541422592fdf64"}
Apr 24 21:33:46.841560 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:46.841511 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v" podStartSLOduration=2.291215361 podStartE2EDuration="4.84149151s" podCreationTimestamp="2026-04-24 21:33:42 +0000 UTC" firstStartedPulling="2026-04-24 21:33:43.202166647 +0000 UTC m=+434.442519887" lastFinishedPulling="2026-04-24 21:33:45.752442797 +0000 UTC m=+436.992796036" observedRunningTime="2026-04-24 21:33:46.839728555 +0000 UTC m=+438.080081817" watchObservedRunningTime="2026-04-24 21:33:46.84149151 +0000 UTC m=+438.081844770"
Apr 24 21:33:47.066723 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:47.066661 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:47.071199 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:47.071133 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:47.822288 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:47.822217 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:47.823358 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:47.823341 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5wn4v"
Apr 24 21:33:52.545229 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.545192 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"]
Apr 24 21:33:52.548426 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.548409 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.551204 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.551181 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 21:33:52.551408 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.551396 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-w8zch\""
Apr 24 21:33:52.552645 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.552628 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 21:33:52.561528 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.561501 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"]
Apr 24 21:33:52.643814 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.643776 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"]
Apr 24 21:33:52.647056 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.647039 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.657500 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.657475 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"]
Apr 24 21:33:52.662505 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.662480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.662629 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.662542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.662629 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.662615 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26mr\" (UniqueName: \"kubernetes.io/projected/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-kube-api-access-l26mr\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.763779 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.763742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qcvv\" (UniqueName: \"kubernetes.io/projected/9759cc1b-d4a6-49c8-95ee-bc86075198d6-kube-api-access-7qcvv\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.763990 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.763798 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l26mr\" (UniqueName: \"kubernetes.io/projected/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-kube-api-access-l26mr\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.763990 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.763834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.763990 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.763857 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.763990 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.763896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.764180 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.764021 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.764252 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.764232 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.764385 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.764364 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.777200 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.777167 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"]
Apr 24 21:33:52.780648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.780625 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.792258 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.792229 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l26mr\" (UniqueName: \"kubernetes.io/projected/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-kube-api-access-l26mr\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.807722 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.807625 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"]
Apr 24 21:33:52.857645 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.857611 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"
Apr 24 21:33:52.864688 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.864634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qcvv\" (UniqueName: \"kubernetes.io/projected/9759cc1b-d4a6-49c8-95ee-bc86075198d6-kube-api-access-7qcvv\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.864805 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.864701 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.864805 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.864746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.864920 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.864817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.864920 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.864889 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.865004 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.864986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swkdm\" (UniqueName: \"kubernetes.io/projected/d3a0de64-7683-441d-98db-f17d662e4104-kube-api-access-swkdm\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.865135 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.865116 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.865256 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.865240 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.881310 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.881278 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qcvv\" (UniqueName: \"kubernetes.io/projected/9759cc1b-d4a6-49c8-95ee-bc86075198d6-kube-api-access-7qcvv\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.944538 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.944483 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"]
Apr 24 21:33:52.949290 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.949263 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"
Apr 24 21:33:52.957392 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.957160 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"
Apr 24 21:33:52.959878 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.959850 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"]
Apr 24 21:33:52.966001 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.965966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swkdm\" (UniqueName: \"kubernetes.io/projected/d3a0de64-7683-441d-98db-f17d662e4104-kube-api-access-swkdm\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.966163 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.966020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.966163 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.966078 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.966446 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.966427 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.966628 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.966581 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.975570 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.975537 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swkdm\" (UniqueName: \"kubernetes.io/projected/d3a0de64-7683-441d-98db-f17d662e4104-kube-api-access-swkdm\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:52.986598 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:52.986566 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql"]
Apr 24 21:33:52.987820 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:33:52.987792 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda698b2e7_1633_4b0d_bcf0_69246f1b39ad.slice/crio-e284594612d3deaa7bdaac1f31ea372a84396aa405be4d806acc3208fb73e9c3 WatchSource:0}: Error finding container e284594612d3deaa7bdaac1f31ea372a84396aa405be4d806acc3208fb73e9c3: Status 404 returned error can't find the container with id e284594612d3deaa7bdaac1f31ea372a84396aa405be4d806acc3208fb73e9c3
Apr 24 21:33:53.067023 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.066994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"
Apr 24 21:33:53.067159 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.067055 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc89f\" (UniqueName: \"kubernetes.io/projected/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-kube-api-access-wc89f\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"
Apr 24 21:33:53.067159 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.067076 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"
Apr 24 21:33:53.089799 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.089762 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"
Apr 24 21:33:53.102078 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.102053 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj"]
Apr 24 21:33:53.105400 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:33:53.105360 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9759cc1b_d4a6_49c8_95ee_bc86075198d6.slice/crio-d8c6c8f16d905d3079f220fe3b6150044b463a97bdc8de0b4555a42f955977db WatchSource:0}: Error finding container d8c6c8f16d905d3079f220fe3b6150044b463a97bdc8de0b4555a42f955977db: Status 404 returned error can't find the container with id d8c6c8f16d905d3079f220fe3b6150044b463a97bdc8de0b4555a42f955977db
Apr 24 21:33:53.167912 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.167882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"
Apr 24 21:33:53.168028 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.167964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc89f\" (UniqueName: \"kubernetes.io/projected/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-kube-api-access-wc89f\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"
Apr 24 21:33:53.168028 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.167987 2569 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" Apr 24 21:33:53.168365 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.168336 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" Apr 24 21:33:53.168411 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.168345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" Apr 24 21:33:53.177005 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.176952 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc89f\" (UniqueName: \"kubernetes.io/projected/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-kube-api-access-wc89f\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" Apr 24 21:33:53.226274 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.226239 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw"] Apr 24 21:33:53.228043 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:33:53.228012 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3a0de64_7683_441d_98db_f17d662e4104.slice/crio-91c1efc1caf35a1f4483546560cda590b7213daebb565c0a4c5c9075eab6ccf4 WatchSource:0}: Error finding container 91c1efc1caf35a1f4483546560cda590b7213daebb565c0a4c5c9075eab6ccf4: Status 404 returned error can't find the container with id 91c1efc1caf35a1f4483546560cda590b7213daebb565c0a4c5c9075eab6ccf4 Apr 24 21:33:53.262328 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.262293 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" Apr 24 21:33:53.399010 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.398940 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg"] Apr 24 21:33:53.430602 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:33:53.430564 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1548f9a5_81d6_4908_a1e9_40b3fd5c8dd5.slice/crio-4b8d78f260239d7b6998bbfe27a0c03e0c963da2809bfbf6f347795c003a358c WatchSource:0}: Error finding container 4b8d78f260239d7b6998bbfe27a0c03e0c963da2809bfbf6f347795c003a358c: Status 404 returned error can't find the container with id 4b8d78f260239d7b6998bbfe27a0c03e0c963da2809bfbf6f347795c003a358c Apr 24 21:33:53.842973 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.842930 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3a0de64-7683-441d-98db-f17d662e4104" containerID="87959250415af688723d37b0d7c1bba2842c2e325be58b1abd3ec6ea1baf85ea" exitCode=0 Apr 24 21:33:53.843434 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:33:53.843029 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw" event={"ID":"d3a0de64-7683-441d-98db-f17d662e4104","Type":"ContainerDied","Data":"87959250415af688723d37b0d7c1bba2842c2e325be58b1abd3ec6ea1baf85ea"} Apr 24 21:33:53.843434 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.843070 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw" event={"ID":"d3a0de64-7683-441d-98db-f17d662e4104","Type":"ContainerStarted","Data":"91c1efc1caf35a1f4483546560cda590b7213daebb565c0a4c5c9075eab6ccf4"} Apr 24 21:33:53.844414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.844393 2569 generic.go:358] "Generic (PLEG): container finished" podID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerID="04506f98637e366cfeeb09fe45a2a7d89baffa0e80b98f17814e26efe2cf22cd" exitCode=0 Apr 24 21:33:53.844575 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.844463 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" event={"ID":"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5","Type":"ContainerDied","Data":"04506f98637e366cfeeb09fe45a2a7d89baffa0e80b98f17814e26efe2cf22cd"} Apr 24 21:33:53.844575 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.844481 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" event={"ID":"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5","Type":"ContainerStarted","Data":"4b8d78f260239d7b6998bbfe27a0c03e0c963da2809bfbf6f347795c003a358c"} Apr 24 21:33:53.845955 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.845934 2569 generic.go:358] "Generic (PLEG): container finished" podID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" 
containerID="5cd58e542c83b43e21a9115c4d9fb190ad0da359b9d80d5c00880c73655215e9" exitCode=0 Apr 24 21:33:53.846077 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.846040 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql" event={"ID":"a698b2e7-1633-4b0d-bcf0-69246f1b39ad","Type":"ContainerDied","Data":"5cd58e542c83b43e21a9115c4d9fb190ad0da359b9d80d5c00880c73655215e9"} Apr 24 21:33:53.846140 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.846087 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql" event={"ID":"a698b2e7-1633-4b0d-bcf0-69246f1b39ad","Type":"ContainerStarted","Data":"e284594612d3deaa7bdaac1f31ea372a84396aa405be4d806acc3208fb73e9c3"} Apr 24 21:33:53.847314 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.847296 2569 generic.go:358] "Generic (PLEG): container finished" podID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerID="cb7e422ded0d701afdc62537f9a9a499ce81e209192118ddad659dd998ded327" exitCode=0 Apr 24 21:33:53.847378 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.847331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj" event={"ID":"9759cc1b-d4a6-49c8-95ee-bc86075198d6","Type":"ContainerDied","Data":"cb7e422ded0d701afdc62537f9a9a499ce81e209192118ddad659dd998ded327"} Apr 24 21:33:53.847378 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:53.847346 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj" event={"ID":"9759cc1b-d4a6-49c8-95ee-bc86075198d6","Type":"ContainerStarted","Data":"d8c6c8f16d905d3079f220fe3b6150044b463a97bdc8de0b4555a42f955977db"} Apr 24 21:33:54.853965 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:54.853911 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj" event={"ID":"9759cc1b-d4a6-49c8-95ee-bc86075198d6","Type":"ContainerStarted","Data":"cd110a1f1faa1738664b5ec351e532c6c505ae9a69ed167d3c73363a065a116b"} Apr 24 21:33:55.859734 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:55.859703 2569 generic.go:358] "Generic (PLEG): container finished" podID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerID="bf5560e5a72979db15024e8325096abbedde5991b4ebcbe41d3bdf33c73cdd90" exitCode=0 Apr 24 21:33:55.860139 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:55.859779 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" event={"ID":"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5","Type":"ContainerDied","Data":"bf5560e5a72979db15024e8325096abbedde5991b4ebcbe41d3bdf33c73cdd90"} Apr 24 21:33:55.861527 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:55.861463 2569 generic.go:358] "Generic (PLEG): container finished" podID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerID="c127ee84791542f5e0040d7fbdfbee89eda7bb529b6392608ab205c8cb4f6c2a" exitCode=0 Apr 24 21:33:55.861577 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:55.861533 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql" event={"ID":"a698b2e7-1633-4b0d-bcf0-69246f1b39ad","Type":"ContainerDied","Data":"c127ee84791542f5e0040d7fbdfbee89eda7bb529b6392608ab205c8cb4f6c2a"} Apr 24 21:33:55.863327 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:55.863306 2569 generic.go:358] "Generic (PLEG): container finished" podID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerID="cd110a1f1faa1738664b5ec351e532c6c505ae9a69ed167d3c73363a065a116b" exitCode=0 Apr 24 21:33:55.863424 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:55.863327 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj" event={"ID":"9759cc1b-d4a6-49c8-95ee-bc86075198d6","Type":"ContainerDied","Data":"cd110a1f1faa1738664b5ec351e532c6c505ae9a69ed167d3c73363a065a116b"} Apr 24 21:33:55.865141 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:55.865115 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3a0de64-7683-441d-98db-f17d662e4104" containerID="0c10cd8fa84bfd2764b0b86f71c74dde601adc7498700f04681f639e27d3fa8b" exitCode=0 Apr 24 21:33:55.865254 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:55.865173 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw" event={"ID":"d3a0de64-7683-441d-98db-f17d662e4104","Type":"ContainerDied","Data":"0c10cd8fa84bfd2764b0b86f71c74dde601adc7498700f04681f639e27d3fa8b"} Apr 24 21:33:56.870937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:56.870902 2569 generic.go:358] "Generic (PLEG): container finished" podID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerID="081cd8862fe07dd090347aa87728eceea2841da817f2a13a6e67a6322d0c09d9" exitCode=0 Apr 24 21:33:56.871309 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:56.870983 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" event={"ID":"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5","Type":"ContainerDied","Data":"081cd8862fe07dd090347aa87728eceea2841da817f2a13a6e67a6322d0c09d9"} Apr 24 21:33:56.872738 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:56.872713 2569 generic.go:358] "Generic (PLEG): container finished" podID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerID="a6dd19817a8f7f4a41437aa4589d40ac4bb6c49df7d006058ba76d71b2ff5510" exitCode=0 Apr 24 21:33:56.872861 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:56.872798 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql" event={"ID":"a698b2e7-1633-4b0d-bcf0-69246f1b39ad","Type":"ContainerDied","Data":"a6dd19817a8f7f4a41437aa4589d40ac4bb6c49df7d006058ba76d71b2ff5510"} Apr 24 21:33:56.874644 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:56.874620 2569 generic.go:358] "Generic (PLEG): container finished" podID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerID="3b9ca43e31fb0981348d388b5bc045c278004afbfca4275ca3615d4ca97769a6" exitCode=0 Apr 24 21:33:56.874770 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:56.874708 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj" event={"ID":"9759cc1b-d4a6-49c8-95ee-bc86075198d6","Type":"ContainerDied","Data":"3b9ca43e31fb0981348d388b5bc045c278004afbfca4275ca3615d4ca97769a6"} Apr 24 21:33:56.876579 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:56.876557 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3a0de64-7683-441d-98db-f17d662e4104" containerID="713a85369d13831fe0e16dc6099df58086f1037c95d1b36dda46a4e747cde5f7" exitCode=0 Apr 24 21:33:56.876660 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:56.876616 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw" event={"ID":"d3a0de64-7683-441d-98db-f17d662e4104","Type":"ContainerDied","Data":"713a85369d13831fe0e16dc6099df58086f1037c95d1b36dda46a4e747cde5f7"} Apr 24 21:33:58.004302 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.004282 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj" Apr 24 21:33:58.053271 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.053246 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw" Apr 24 21:33:58.076732 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.076711 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" Apr 24 21:33:58.086761 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.086740 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql" Apr 24 21:33:58.111776 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.111741 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qcvv\" (UniqueName: \"kubernetes.io/projected/9759cc1b-d4a6-49c8-95ee-bc86075198d6-kube-api-access-7qcvv\") pod \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " Apr 24 21:33:58.111923 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.111826 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-bundle\") pod \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " Apr 24 21:33:58.111923 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.111889 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-util\") pod \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\" (UID: \"9759cc1b-d4a6-49c8-95ee-bc86075198d6\") " Apr 24 21:33:58.112546 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.112464 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-bundle" (OuterVolumeSpecName: "bundle") pod 
"9759cc1b-d4a6-49c8-95ee-bc86075198d6" (UID: "9759cc1b-d4a6-49c8-95ee-bc86075198d6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:58.114156 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.114125 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9759cc1b-d4a6-49c8-95ee-bc86075198d6-kube-api-access-7qcvv" (OuterVolumeSpecName: "kube-api-access-7qcvv") pod "9759cc1b-d4a6-49c8-95ee-bc86075198d6" (UID: "9759cc1b-d4a6-49c8-95ee-bc86075198d6"). InnerVolumeSpecName "kube-api-access-7qcvv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:58.118406 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.118381 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-util" (OuterVolumeSpecName: "util") pod "9759cc1b-d4a6-49c8-95ee-bc86075198d6" (UID: "9759cc1b-d4a6-49c8-95ee-bc86075198d6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:58.212960 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.212882 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-bundle\") pod \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " Apr 24 21:33:58.212960 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.212922 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swkdm\" (UniqueName: \"kubernetes.io/projected/d3a0de64-7683-441d-98db-f17d662e4104-kube-api-access-swkdm\") pod \"d3a0de64-7683-441d-98db-f17d662e4104\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " Apr 24 21:33:58.212960 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.212942 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-bundle\") pod \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " Apr 24 21:33:58.212960 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.212957 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc89f\" (UniqueName: \"kubernetes.io/projected/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-kube-api-access-wc89f\") pod \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " Apr 24 21:33:58.213265 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.212981 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l26mr\" (UniqueName: \"kubernetes.io/projected/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-kube-api-access-l26mr\") pod \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " Apr 24 21:33:58.213265 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:33:58.213023 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-bundle\") pod \"d3a0de64-7683-441d-98db-f17d662e4104\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " Apr 24 21:33:58.213265 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.213040 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-util\") pod \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\" (UID: \"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5\") " Apr 24 21:33:58.213265 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.213101 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-util\") pod \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\" (UID: \"a698b2e7-1633-4b0d-bcf0-69246f1b39ad\") " Apr 24 21:33:58.213265 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.213127 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-util\") pod \"d3a0de64-7683-441d-98db-f17d662e4104\" (UID: \"d3a0de64-7683-441d-98db-f17d662e4104\") " Apr 24 21:33:58.213498 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.213399 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.213498 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.213424 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9759cc1b-d4a6-49c8-95ee-bc86075198d6-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.213498 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:33:58.213439 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qcvv\" (UniqueName: \"kubernetes.io/projected/9759cc1b-d4a6-49c8-95ee-bc86075198d6-kube-api-access-7qcvv\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.213651 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.213589 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-bundle" (OuterVolumeSpecName: "bundle") pod "a698b2e7-1633-4b0d-bcf0-69246f1b39ad" (UID: "a698b2e7-1633-4b0d-bcf0-69246f1b39ad"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:58.213832 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.213810 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-bundle" (OuterVolumeSpecName: "bundle") pod "1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" (UID: "1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:58.214322 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.214288 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-bundle" (OuterVolumeSpecName: "bundle") pod "d3a0de64-7683-441d-98db-f17d662e4104" (UID: "d3a0de64-7683-441d-98db-f17d662e4104"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:58.215499 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.215469 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a0de64-7683-441d-98db-f17d662e4104-kube-api-access-swkdm" (OuterVolumeSpecName: "kube-api-access-swkdm") pod "d3a0de64-7683-441d-98db-f17d662e4104" (UID: "d3a0de64-7683-441d-98db-f17d662e4104"). InnerVolumeSpecName "kube-api-access-swkdm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:58.215646 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.215628 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-kube-api-access-l26mr" (OuterVolumeSpecName: "kube-api-access-l26mr") pod "a698b2e7-1633-4b0d-bcf0-69246f1b39ad" (UID: "a698b2e7-1633-4b0d-bcf0-69246f1b39ad"). InnerVolumeSpecName "kube-api-access-l26mr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:58.215729 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.215712 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-kube-api-access-wc89f" (OuterVolumeSpecName: "kube-api-access-wc89f") pod "1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" (UID: "1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5"). InnerVolumeSpecName "kube-api-access-wc89f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:58.219652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.219624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-util" (OuterVolumeSpecName: "util") pod "a698b2e7-1633-4b0d-bcf0-69246f1b39ad" (UID: "a698b2e7-1633-4b0d-bcf0-69246f1b39ad"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:58.219799 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.219778 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-util" (OuterVolumeSpecName: "util") pod "1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" (UID: "1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:58.220038 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.220021 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-util" (OuterVolumeSpecName: "util") pod "d3a0de64-7683-441d-98db-f17d662e4104" (UID: "d3a0de64-7683-441d-98db-f17d662e4104"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:33:58.313883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313847 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.313883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313881 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-swkdm\" (UniqueName: \"kubernetes.io/projected/d3a0de64-7683-441d-98db-f17d662e4104-kube-api-access-swkdm\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.313883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313892 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.314070 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313901 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wc89f\" (UniqueName: \"kubernetes.io/projected/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-kube-api-access-wc89f\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.314070 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313910 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l26mr\" (UniqueName: \"kubernetes.io/projected/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-kube-api-access-l26mr\") on node 
\"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.314070 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313919 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.314070 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313928 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.314070 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313936 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a698b2e7-1633-4b0d-bcf0-69246f1b39ad-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.314070 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.313944 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3a0de64-7683-441d-98db-f17d662e4104-util\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:33:58.887720 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.886562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw" event={"ID":"d3a0de64-7683-441d-98db-f17d662e4104","Type":"ContainerDied","Data":"91c1efc1caf35a1f4483546560cda590b7213daebb565c0a4c5c9075eab6ccf4"} Apr 24 21:33:58.887720 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.886616 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91c1efc1caf35a1f4483546560cda590b7213daebb565c0a4c5c9075eab6ccf4" Apr 24 21:33:58.887720 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.886778 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bqx9lw" Apr 24 21:33:58.893535 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.893502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" event={"ID":"1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5","Type":"ContainerDied","Data":"4b8d78f260239d7b6998bbfe27a0c03e0c963da2809bfbf6f347795c003a358c"} Apr 24 21:33:58.893535 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.893538 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b8d78f260239d7b6998bbfe27a0c03e0c963da2809bfbf6f347795c003a358c" Apr 24 21:33:58.893775 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.893545 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503b9plg" Apr 24 21:33:58.895324 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.895298 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql" event={"ID":"a698b2e7-1633-4b0d-bcf0-69246f1b39ad","Type":"ContainerDied","Data":"e284594612d3deaa7bdaac1f31ea372a84396aa405be4d806acc3208fb73e9c3"} Apr 24 21:33:58.895451 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.895328 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e284594612d3deaa7bdaac1f31ea372a84396aa405be4d806acc3208fb73e9c3" Apr 24 21:33:58.895451 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.895334 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88d84ql" Apr 24 21:33:58.897463 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.897194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj" event={"ID":"9759cc1b-d4a6-49c8-95ee-bc86075198d6","Type":"ContainerDied","Data":"d8c6c8f16d905d3079f220fe3b6150044b463a97bdc8de0b4555a42f955977db"} Apr 24 21:33:58.897463 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.897220 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8c6c8f16d905d3079f220fe3b6150044b463a97bdc8de0b4555a42f955977db" Apr 24 21:33:58.897463 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:33:58.897266 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c304tjxj" Apr 24 21:34:06.080045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.079999 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9"] Apr 24 21:34:06.080512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080445 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerName="util" Apr 24 21:34:06.080512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080463 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerName="util" Apr 24 21:34:06.080512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080472 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3a0de64-7683-441d-98db-f17d662e4104" containerName="pull" Apr 24 21:34:06.080512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080480 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d3a0de64-7683-441d-98db-f17d662e4104" containerName="pull" Apr 24 21:34:06.080512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080491 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerName="util" Apr 24 21:34:06.080512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080499 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerName="util" Apr 24 21:34:06.080512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080509 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3a0de64-7683-441d-98db-f17d662e4104" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080517 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a0de64-7683-441d-98db-f17d662e4104" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080529 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerName="util" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080537 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerName="util" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080548 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080558 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080571 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3a0de64-7683-441d-98db-f17d662e4104" 
containerName="util" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080579 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a0de64-7683-441d-98db-f17d662e4104" containerName="util" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080588 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerName="pull" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080596 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerName="pull" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080615 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerName="pull" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080622 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerName="pull" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080632 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerName="pull" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080640 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerName="pull" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080650 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080660 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:34:06.080696 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080705 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080791 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a698b2e7-1633-4b0d-bcf0-69246f1b39ad" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080803 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3a0de64-7683-441d-98db-f17d662e4104" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080811 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9759cc1b-d4a6-49c8-95ee-bc86075198d6" containerName="extract" Apr 24 21:34:06.080883 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.080823 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1548f9a5-81d6-4908-a1e9-40b3fd5c8dd5" containerName="extract" Apr 24 21:34:06.085433 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.085410 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" Apr 24 21:34:06.089415 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.089394 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-dm6x2\"" Apr 24 21:34:06.089837 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.089821 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 21:34:06.090710 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.090691 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 21:34:06.106096 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.106074 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9"] Apr 24 21:34:06.280273 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.280244 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwpp6\" (UniqueName: \"kubernetes.io/projected/16fb4c1f-74d0-4dd7-a780-5de816a3d86d-kube-api-access-qwpp6\") pod \"limitador-operator-controller-manager-c7fb4c8d5-499d9\" (UID: \"16fb4c1f-74d0-4dd7-a780-5de816a3d86d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" Apr 24 21:34:06.381455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.381363 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwpp6\" (UniqueName: \"kubernetes.io/projected/16fb4c1f-74d0-4dd7-a780-5de816a3d86d-kube-api-access-qwpp6\") pod \"limitador-operator-controller-manager-c7fb4c8d5-499d9\" (UID: \"16fb4c1f-74d0-4dd7-a780-5de816a3d86d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" Apr 24 21:34:06.404814 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:34:06.404783 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwpp6\" (UniqueName: \"kubernetes.io/projected/16fb4c1f-74d0-4dd7-a780-5de816a3d86d-kube-api-access-qwpp6\") pod \"limitador-operator-controller-manager-c7fb4c8d5-499d9\" (UID: \"16fb4c1f-74d0-4dd7-a780-5de816a3d86d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" Apr 24 21:34:06.695517 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.695424 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" Apr 24 21:34:06.875688 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.875457 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9"] Apr 24 21:34:06.878592 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:34:06.878562 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fb4c1f_74d0_4dd7_a780_5de816a3d86d.slice/crio-218449f93204f6b3f29130eab5745ff9f835e2655ae9c8bb4358a35c84201eb2 WatchSource:0}: Error finding container 218449f93204f6b3f29130eab5745ff9f835e2655ae9c8bb4358a35c84201eb2: Status 404 returned error can't find the container with id 218449f93204f6b3f29130eab5745ff9f835e2655ae9c8bb4358a35c84201eb2 Apr 24 21:34:06.926096 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:06.926060 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" event={"ID":"16fb4c1f-74d0-4dd7-a780-5de816a3d86d","Type":"ContainerStarted","Data":"218449f93204f6b3f29130eab5745ff9f835e2655ae9c8bb4358a35c84201eb2"} Apr 24 21:34:07.420421 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:07.420389 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67fbd788db-nrgbl"] Apr 24 21:34:09.939735 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:09.939699 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" event={"ID":"16fb4c1f-74d0-4dd7-a780-5de816a3d86d","Type":"ContainerStarted","Data":"a4111190ac68e8f162c257d37c26a8284a094d8577a188b9b6540413e32ce98a"} Apr 24 21:34:09.940178 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:09.939876 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" Apr 24 21:34:10.005144 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:10.005086 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" podStartSLOduration=1.6071192330000001 podStartE2EDuration="4.005069111s" podCreationTimestamp="2026-04-24 21:34:06 +0000 UTC" firstStartedPulling="2026-04-24 21:34:06.880592959 +0000 UTC m=+458.120946199" lastFinishedPulling="2026-04-24 21:34:09.278542835 +0000 UTC m=+460.518896077" observedRunningTime="2026-04-24 21:34:10.002985579 +0000 UTC m=+461.243338840" watchObservedRunningTime="2026-04-24 21:34:10.005069111 +0000 UTC m=+461.245422371" Apr 24 21:34:14.141844 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.141807 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cn95g"] Apr 24 21:34:14.150039 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.150016 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" Apr 24 21:34:14.154296 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.154268 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-jzqv9\"" Apr 24 21:34:14.168721 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.167587 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cn95g"] Apr 24 21:34:14.234319 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.234288 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ltbg\" (UniqueName: \"kubernetes.io/projected/aaedb57f-ea16-416f-9886-ed7e3462f547-kube-api-access-5ltbg\") pod \"authorino-operator-7587b89b76-cn95g\" (UID: \"aaedb57f-ea16-416f-9886-ed7e3462f547\") " pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" Apr 24 21:34:14.334711 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.334653 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ltbg\" (UniqueName: \"kubernetes.io/projected/aaedb57f-ea16-416f-9886-ed7e3462f547-kube-api-access-5ltbg\") pod \"authorino-operator-7587b89b76-cn95g\" (UID: \"aaedb57f-ea16-416f-9886-ed7e3462f547\") " pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" Apr 24 21:34:14.348126 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.348100 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ltbg\" (UniqueName: \"kubernetes.io/projected/aaedb57f-ea16-416f-9886-ed7e3462f547-kube-api-access-5ltbg\") pod \"authorino-operator-7587b89b76-cn95g\" (UID: \"aaedb57f-ea16-416f-9886-ed7e3462f547\") " pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" Apr 24 21:34:14.460305 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.460228 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" Apr 24 21:34:14.592586 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.592561 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cn95g"] Apr 24 21:34:14.597234 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:34:14.597198 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaedb57f_ea16_416f_9886_ed7e3462f547.slice/crio-ed19b2a71fa222c221c2b11aa6d2b9c11519d52d0c6d928a5fdb5cdd9f2ca203 WatchSource:0}: Error finding container ed19b2a71fa222c221c2b11aa6d2b9c11519d52d0c6d928a5fdb5cdd9f2ca203: Status 404 returned error can't find the container with id ed19b2a71fa222c221c2b11aa6d2b9c11519d52d0c6d928a5fdb5cdd9f2ca203 Apr 24 21:34:14.958911 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:14.958879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" event={"ID":"aaedb57f-ea16-416f-9886-ed7e3462f547","Type":"ContainerStarted","Data":"ed19b2a71fa222c221c2b11aa6d2b9c11519d52d0c6d928a5fdb5cdd9f2ca203"} Apr 24 21:34:16.967647 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:16.967613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" event={"ID":"aaedb57f-ea16-416f-9886-ed7e3462f547","Type":"ContainerStarted","Data":"fdf19f39277542b84587622ded6a32dee1ddc0c747732eb1d2e9844c9b019cfd"} Apr 24 21:34:16.968032 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:16.967705 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" Apr 24 21:34:17.000288 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.000237 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" podStartSLOduration=1.273496494 
podStartE2EDuration="3.000223219s" podCreationTimestamp="2026-04-24 21:34:14 +0000 UTC" firstStartedPulling="2026-04-24 21:34:14.598811057 +0000 UTC m=+465.839164296" lastFinishedPulling="2026-04-24 21:34:16.325537778 +0000 UTC m=+467.565891021" observedRunningTime="2026-04-24 21:34:16.996083268 +0000 UTC m=+468.236436528" watchObservedRunningTime="2026-04-24 21:34:17.000223219 +0000 UTC m=+468.240576478" Apr 24 21:34:17.408232 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.408199 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h"] Apr 24 21:34:17.411682 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.411651 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" Apr 24 21:34:17.416184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.416161 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-dlvk6\"" Apr 24 21:34:17.416296 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.416193 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 24 21:34:17.445647 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.445622 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h"] Apr 24 21:34:17.462581 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.462547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp9kx\" (UniqueName: \"kubernetes.io/projected/38cd02a3-c2aa-46cd-b870-a44a5cd71fe9-kube-api-access-fp9kx\") pod \"dns-operator-controller-manager-844548ff4c-b9b2h\" (UID: \"38cd02a3-c2aa-46cd-b870-a44a5cd71fe9\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" Apr 24 
21:34:17.563327 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.563291 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp9kx\" (UniqueName: \"kubernetes.io/projected/38cd02a3-c2aa-46cd-b870-a44a5cd71fe9-kube-api-access-fp9kx\") pod \"dns-operator-controller-manager-844548ff4c-b9b2h\" (UID: \"38cd02a3-c2aa-46cd-b870-a44a5cd71fe9\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" Apr 24 21:34:17.574360 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.574331 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp9kx\" (UniqueName: \"kubernetes.io/projected/38cd02a3-c2aa-46cd-b870-a44a5cd71fe9-kube-api-access-fp9kx\") pod \"dns-operator-controller-manager-844548ff4c-b9b2h\" (UID: \"38cd02a3-c2aa-46cd-b870-a44a5cd71fe9\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" Apr 24 21:34:17.721550 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.721455 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" Apr 24 21:34:17.867494 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.867470 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h"] Apr 24 21:34:17.868757 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:34:17.868726 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38cd02a3_c2aa_46cd_b870_a44a5cd71fe9.slice/crio-d756f95cae0531eb3a991422a344af516a1d391c26b885000e1473e553a680f6 WatchSource:0}: Error finding container d756f95cae0531eb3a991422a344af516a1d391c26b885000e1473e553a680f6: Status 404 returned error can't find the container with id d756f95cae0531eb3a991422a344af516a1d391c26b885000e1473e553a680f6 Apr 24 21:34:17.972603 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:17.972525 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" event={"ID":"38cd02a3-c2aa-46cd-b870-a44a5cd71fe9","Type":"ContainerStarted","Data":"d756f95cae0531eb3a991422a344af516a1d391c26b885000e1473e553a680f6"} Apr 24 21:34:19.981353 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:19.981247 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" event={"ID":"38cd02a3-c2aa-46cd-b870-a44a5cd71fe9","Type":"ContainerStarted","Data":"3c502701f473b6b662b880ad272baaa9afde733926d9405d01c53602b0a62695"} Apr 24 21:34:19.981792 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:19.981354 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" Apr 24 21:34:20.028430 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:20.028377 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" podStartSLOduration=1.28503457 podStartE2EDuration="3.028364556s" podCreationTimestamp="2026-04-24 21:34:17 +0000 UTC" firstStartedPulling="2026-04-24 21:34:17.870972997 +0000 UTC m=+469.111326237" lastFinishedPulling="2026-04-24 21:34:19.614302981 +0000 UTC m=+470.854656223" observedRunningTime="2026-04-24 21:34:20.024215443 +0000 UTC m=+471.264568703" watchObservedRunningTime="2026-04-24 21:34:20.028364556 +0000 UTC m=+471.268717816" Apr 24 21:34:20.946008 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:20.945976 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-499d9" Apr 24 21:34:27.975390 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:27.975359 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-cn95g" Apr 24 21:34:30.988747 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:30.988711 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-b9b2h" Apr 24 21:34:32.439429 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.439389 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67fbd788db-nrgbl" podUID="738082d9-1cd9-449c-a217-e889775afdaa" containerName="console" containerID="cri-o://d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33" gracePeriod=15 Apr 24 21:34:32.685376 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.685356 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67fbd788db-nrgbl_738082d9-1cd9-449c-a217-e889775afdaa/console/0.log" Apr 24 21:34:32.685484 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.685420 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:34:32.796711 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.796652 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-service-ca\") pod \"738082d9-1cd9-449c-a217-e889775afdaa\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " Apr 24 21:34:32.796711 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.796715 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-oauth-config\") pod \"738082d9-1cd9-449c-a217-e889775afdaa\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " Apr 24 21:34:32.796915 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.796820 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-serving-cert\") pod \"738082d9-1cd9-449c-a217-e889775afdaa\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " Apr 24 21:34:32.796915 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.796870 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-trusted-ca-bundle\") pod \"738082d9-1cd9-449c-a217-e889775afdaa\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " Apr 24 21:34:32.796915 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.796900 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkmgc\" (UniqueName: \"kubernetes.io/projected/738082d9-1cd9-449c-a217-e889775afdaa-kube-api-access-xkmgc\") pod \"738082d9-1cd9-449c-a217-e889775afdaa\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " Apr 24 21:34:32.797054 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.796921 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-console-config\") pod \"738082d9-1cd9-449c-a217-e889775afdaa\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " Apr 24 21:34:32.797054 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.796961 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-oauth-serving-cert\") pod \"738082d9-1cd9-449c-a217-e889775afdaa\" (UID: \"738082d9-1cd9-449c-a217-e889775afdaa\") " Apr 24 21:34:32.797252 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.797230 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-service-ca" (OuterVolumeSpecName: "service-ca") pod "738082d9-1cd9-449c-a217-e889775afdaa" (UID: "738082d9-1cd9-449c-a217-e889775afdaa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:32.797453 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.797424 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-console-config" (OuterVolumeSpecName: "console-config") pod "738082d9-1cd9-449c-a217-e889775afdaa" (UID: "738082d9-1cd9-449c-a217-e889775afdaa"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:32.797453 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.797436 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "738082d9-1cd9-449c-a217-e889775afdaa" (UID: "738082d9-1cd9-449c-a217-e889775afdaa"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:32.797692 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.797513 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "738082d9-1cd9-449c-a217-e889775afdaa" (UID: "738082d9-1cd9-449c-a217-e889775afdaa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:32.799192 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.799171 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738082d9-1cd9-449c-a217-e889775afdaa-kube-api-access-xkmgc" (OuterVolumeSpecName: "kube-api-access-xkmgc") pod "738082d9-1cd9-449c-a217-e889775afdaa" (UID: "738082d9-1cd9-449c-a217-e889775afdaa"). InnerVolumeSpecName "kube-api-access-xkmgc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:32.799437 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.799421 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "738082d9-1cd9-449c-a217-e889775afdaa" (UID: "738082d9-1cd9-449c-a217-e889775afdaa"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:32.799507 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.799446 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "738082d9-1cd9-449c-a217-e889775afdaa" (UID: "738082d9-1cd9-449c-a217-e889775afdaa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:32.898370 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.898333 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-service-ca\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:34:32.898370 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.898364 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-oauth-config\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:34:32.898370 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.898374 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738082d9-1cd9-449c-a217-e889775afdaa-console-serving-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:34:32.898565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.898383 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-trusted-ca-bundle\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:34:32.898565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.898392 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xkmgc\" (UniqueName: 
\"kubernetes.io/projected/738082d9-1cd9-449c-a217-e889775afdaa-kube-api-access-xkmgc\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:34:32.898565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.898402 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-console-config\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:34:32.898565 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:32.898410 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738082d9-1cd9-449c-a217-e889775afdaa-oauth-serving-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:34:33.037131 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.037106 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67fbd788db-nrgbl_738082d9-1cd9-449c-a217-e889775afdaa/console/0.log" Apr 24 21:34:33.037290 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.037147 2569 generic.go:358] "Generic (PLEG): container finished" podID="738082d9-1cd9-449c-a217-e889775afdaa" containerID="d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33" exitCode=2 Apr 24 21:34:33.037290 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.037190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67fbd788db-nrgbl" event={"ID":"738082d9-1cd9-449c-a217-e889775afdaa","Type":"ContainerDied","Data":"d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33"} Apr 24 21:34:33.037290 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.037214 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67fbd788db-nrgbl" event={"ID":"738082d9-1cd9-449c-a217-e889775afdaa","Type":"ContainerDied","Data":"348b6ac4fd471ecbcaf35802ff507419630cbaf39ced3e7a156f0b835c71b695"} Apr 24 21:34:33.037290 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:34:33.037229 2569 scope.go:117] "RemoveContainer" containerID="d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33" Apr 24 21:34:33.037290 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.037224 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67fbd788db-nrgbl" Apr 24 21:34:33.046589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.046566 2569 scope.go:117] "RemoveContainer" containerID="d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33" Apr 24 21:34:33.046896 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:34:33.046851 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33\": container with ID starting with d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33 not found: ID does not exist" containerID="d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33" Apr 24 21:34:33.046896 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.046879 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33"} err="failed to get container status \"d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33\": rpc error: code = NotFound desc = could not find container \"d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33\": container with ID starting with d63108eab796f434550dc6b1cf746e01ac8b6ee009e846d30d478394b8de4d33 not found: ID does not exist" Apr 24 21:34:33.066453 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.066426 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67fbd788db-nrgbl"] Apr 24 21:34:33.072342 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.072319 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-67fbd788db-nrgbl"] Apr 24 21:34:33.332548 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:34:33.332469 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738082d9-1cd9-449c-a217-e889775afdaa" path="/var/lib/kubelet/pods/738082d9-1cd9-449c-a217-e889775afdaa/volumes" Apr 24 21:35:03.333314 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.333275 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-nw8j6"] Apr 24 21:35:03.333992 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.333969 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="738082d9-1cd9-449c-a217-e889775afdaa" containerName="console" Apr 24 21:35:03.334062 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.333996 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="738082d9-1cd9-449c-a217-e889775afdaa" containerName="console" Apr 24 21:35:03.334180 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.334167 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="738082d9-1cd9-449c-a217-e889775afdaa" containerName="console" Apr 24 21:35:03.340113 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.340062 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:03.342524 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.342498 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-nw8j6"] Apr 24 21:35:03.343276 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.343256 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ps4w2\"" Apr 24 21:35:03.343375 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.343313 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 24 21:35:03.379533 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.379492 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-nw8j6"] Apr 24 21:35:03.448937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.448889 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72wz\" (UniqueName: \"kubernetes.io/projected/04c606e3-591e-4913-844a-50bbb025eaad-kube-api-access-s72wz\") pod \"limitador-limitador-67566c68b4-nw8j6\" (UID: \"04c606e3-591e-4913-844a-50bbb025eaad\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:03.448937 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.448936 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/04c606e3-591e-4913-844a-50bbb025eaad-config-file\") pod \"limitador-limitador-67566c68b4-nw8j6\" (UID: \"04c606e3-591e-4913-844a-50bbb025eaad\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:03.550044 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.550006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s72wz\" 
(UniqueName: \"kubernetes.io/projected/04c606e3-591e-4913-844a-50bbb025eaad-kube-api-access-s72wz\") pod \"limitador-limitador-67566c68b4-nw8j6\" (UID: \"04c606e3-591e-4913-844a-50bbb025eaad\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:03.550044 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.550046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/04c606e3-591e-4913-844a-50bbb025eaad-config-file\") pod \"limitador-limitador-67566c68b4-nw8j6\" (UID: \"04c606e3-591e-4913-844a-50bbb025eaad\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:03.550638 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.550614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/04c606e3-591e-4913-844a-50bbb025eaad-config-file\") pod \"limitador-limitador-67566c68b4-nw8j6\" (UID: \"04c606e3-591e-4913-844a-50bbb025eaad\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:03.559597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.559567 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72wz\" (UniqueName: \"kubernetes.io/projected/04c606e3-591e-4913-844a-50bbb025eaad-kube-api-access-s72wz\") pod \"limitador-limitador-67566c68b4-nw8j6\" (UID: \"04c606e3-591e-4913-844a-50bbb025eaad\") " pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:03.651581 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.651491 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:03.784965 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:03.784935 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-nw8j6"] Apr 24 21:35:03.786144 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:35:03.786116 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c606e3_591e_4913_844a_50bbb025eaad.slice/crio-ff6b81eedf9a90b8b7d10d54a57bcc685c05f447fe96de1fdf601f33103d506a WatchSource:0}: Error finding container ff6b81eedf9a90b8b7d10d54a57bcc685c05f447fe96de1fdf601f33103d506a: Status 404 returned error can't find the container with id ff6b81eedf9a90b8b7d10d54a57bcc685c05f447fe96de1fdf601f33103d506a Apr 24 21:35:04.151460 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:04.151424 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" event={"ID":"04c606e3-591e-4913-844a-50bbb025eaad","Type":"ContainerStarted","Data":"ff6b81eedf9a90b8b7d10d54a57bcc685c05f447fe96de1fdf601f33103d506a"} Apr 24 21:35:08.168742 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:08.168703 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" event={"ID":"04c606e3-591e-4913-844a-50bbb025eaad","Type":"ContainerStarted","Data":"21eea9455ab23d05713e703b18f08aa2b359f458a982ae3e2f658f7aa9f64c8d"} Apr 24 21:35:08.169124 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:08.168833 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:08.191911 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:08.191860 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" podStartSLOduration=1.265801686 
podStartE2EDuration="5.191846833s" podCreationTimestamp="2026-04-24 21:35:03 +0000 UTC" firstStartedPulling="2026-04-24 21:35:03.788769573 +0000 UTC m=+515.029122812" lastFinishedPulling="2026-04-24 21:35:07.714814718 +0000 UTC m=+518.955167959" observedRunningTime="2026-04-24 21:35:08.190479821 +0000 UTC m=+519.430833083" watchObservedRunningTime="2026-04-24 21:35:08.191846833 +0000 UTC m=+519.432200137" Apr 24 21:35:19.174648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:19.174567 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-nw8j6" Apr 24 21:35:42.614249 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.614204 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s"] Apr 24 21:35:42.614816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.614533 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" podUID="6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" containerName="discovery" containerID="cri-o://2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a" gracePeriod=30 Apr 24 21:35:42.868585 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.868520 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:35:42.982648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.982616 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-local-certs\") pod \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " Apr 24 21:35:42.982648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.982654 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-token\") pod \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " Apr 24 21:35:42.982925 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.982776 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-kubeconfig\") pod \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " Apr 24 21:35:42.982925 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.982821 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-dns-cert\") pod \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " Apr 24 21:35:42.982925 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.982848 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l55r6\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-kube-api-access-l55r6\") pod \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " Apr 24 21:35:42.982925 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:35:42.982875 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-cacerts\") pod \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " Apr 24 21:35:42.982925 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.982908 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-ca-configmap\") pod \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\" (UID: \"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb\") " Apr 24 21:35:42.983724 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.983690 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" (UID: "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:42.985373 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.985345 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" (UID: "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb"). InnerVolumeSpecName "istio-kubeconfig". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:42.985477 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.985412 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-token" (OuterVolumeSpecName: "istio-token") pod "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" (UID: "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:42.985715 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.985691 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-local-certs" (OuterVolumeSpecName: "local-certs") pod "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" (UID: "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:35:42.985974 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.985945 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-kube-api-access-l55r6" (OuterVolumeSpecName: "kube-api-access-l55r6") pod "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" (UID: "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb"). InnerVolumeSpecName "kube-api-access-l55r6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:42.986075 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.986039 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-cacerts" (OuterVolumeSpecName: "cacerts") pod "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" (UID: "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb"). InnerVolumeSpecName "cacerts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:42.986858 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:42.986837 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" (UID: "6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:43.083920 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.083883 2569 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-kubeconfig\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.083920 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.083913 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-dns-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.083920 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.083925 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l55r6\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-kube-api-access-l55r6\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.084160 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.083935 2569 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-cacerts\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.084160 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.083944 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-csr-ca-configmap\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.084160 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.083953 2569 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-local-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.084160 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.083961 2569 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb-istio-token\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:35:43.307817 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.307784 2569 generic.go:358] "Generic (PLEG): container finished" podID="6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" containerID="2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a" exitCode=0 Apr 24 21:35:43.307987 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.307881 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" Apr 24 21:35:43.307987 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.307882 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" event={"ID":"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb","Type":"ContainerDied","Data":"2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a"} Apr 24 21:35:43.307987 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.307934 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s" event={"ID":"6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb","Type":"ContainerDied","Data":"957f40773d7ab97071c90d652b8c90b1095709c978bd66c3da1999608e486310"} Apr 24 21:35:43.307987 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.307954 2569 scope.go:117] "RemoveContainer" containerID="2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a" Apr 24 21:35:43.318512 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.318496 2569 scope.go:117] "RemoveContainer" containerID="2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a" Apr 24 21:35:43.318790 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:35:43.318770 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a\": container with ID starting with 2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a not found: ID does not exist" containerID="2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a" Apr 24 21:35:43.318854 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.318798 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a"} err="failed to get container status 
\"2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a\": rpc error: code = NotFound desc = could not find container \"2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a\": container with ID starting with 2412df1c649273d18b641b79138c40712c23c3ea74c297fcf0e650c299ed267a not found: ID does not exist" Apr 24 21:35:43.335916 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.335886 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s"] Apr 24 21:35:43.344651 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:43.344623 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-w7n5s"] Apr 24 21:35:45.332550 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:45.332518 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" path="/var/lib/kubelet/pods/6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb/volumes" Apr 24 21:35:48.981471 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.981441 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84b6647887-j9pnr"] Apr 24 21:35:48.981914 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.981855 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" containerName="discovery" Apr 24 21:35:48.981914 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.981873 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" containerName="discovery" Apr 24 21:35:48.982043 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.981939 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e0e3cd2-74a2-4b90-97e5-e86d9e48ecdb" containerName="discovery" Apr 24 21:35:48.986529 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.986507 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:48.990502 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.990476 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:35:48.990711 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.990484 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:35:48.990711 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.990503 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:35:48.990870 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.990535 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-vwx2n\"" Apr 24 21:35:48.998638 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:48.998611 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-j9pnr"] Apr 24 21:35:49.005687 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.005641 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-85ccfc4685-l5fj5"] Apr 24 21:35:49.009120 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.009097 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:49.011930 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.011904 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-sq9nj\"" Apr 24 21:35:49.012108 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.012050 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 21:35:49.022537 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.022510 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-85ccfc4685-l5fj5"] Apr 24 21:35:49.039139 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.039112 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-fx2vc"] Apr 24 21:35:49.043075 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.043059 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:49.045981 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.045959 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:35:49.046094 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.046009 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6rszr\"" Apr 24 21:35:49.052169 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.052145 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-fx2vc"] Apr 24 21:35:49.131994 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.131955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dfb56860-3914-407b-874f-fced08269626-data\") pod \"seaweedfs-86cc847c5c-fx2vc\" (UID: \"dfb56860-3914-407b-874f-fced08269626\") " pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:49.132180 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.132008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-277gz\" (UniqueName: \"kubernetes.io/projected/bc0414db-20b9-4b50-b5cb-b914c3dd9086-kube-api-access-277gz\") pod \"kserve-controller-manager-84b6647887-j9pnr\" (UID: \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\") " pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:49.132180 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.132099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27f39f5f-b64b-482b-b684-9c573028ee21-cert\") pod \"llmisvc-controller-manager-85ccfc4685-l5fj5\" (UID: \"27f39f5f-b64b-482b-b684-9c573028ee21\") " pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:49.132180 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:35:49.132132 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc0414db-20b9-4b50-b5cb-b914c3dd9086-cert\") pod \"kserve-controller-manager-84b6647887-j9pnr\" (UID: \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\") " pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:49.132180 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.132171 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7vmb\" (UniqueName: \"kubernetes.io/projected/27f39f5f-b64b-482b-b684-9c573028ee21-kube-api-access-h7vmb\") pod \"llmisvc-controller-manager-85ccfc4685-l5fj5\" (UID: \"27f39f5f-b64b-482b-b684-9c573028ee21\") " pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:49.132321 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.132195 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvjsf\" (UniqueName: \"kubernetes.io/projected/dfb56860-3914-407b-874f-fced08269626-kube-api-access-vvjsf\") pod \"seaweedfs-86cc847c5c-fx2vc\" (UID: \"dfb56860-3914-407b-874f-fced08269626\") " pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:49.232956 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.232875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7vmb\" (UniqueName: \"kubernetes.io/projected/27f39f5f-b64b-482b-b684-9c573028ee21-kube-api-access-h7vmb\") pod \"llmisvc-controller-manager-85ccfc4685-l5fj5\" (UID: \"27f39f5f-b64b-482b-b684-9c573028ee21\") " pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:49.232956 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.232913 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvjsf\" (UniqueName: \"kubernetes.io/projected/dfb56860-3914-407b-874f-fced08269626-kube-api-access-vvjsf\") 
pod \"seaweedfs-86cc847c5c-fx2vc\" (UID: \"dfb56860-3914-407b-874f-fced08269626\") " pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:49.233181 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.232967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dfb56860-3914-407b-874f-fced08269626-data\") pod \"seaweedfs-86cc847c5c-fx2vc\" (UID: \"dfb56860-3914-407b-874f-fced08269626\") " pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:49.233181 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.233010 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-277gz\" (UniqueName: \"kubernetes.io/projected/bc0414db-20b9-4b50-b5cb-b914c3dd9086-kube-api-access-277gz\") pod \"kserve-controller-manager-84b6647887-j9pnr\" (UID: \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\") " pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:49.233181 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.233154 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27f39f5f-b64b-482b-b684-9c573028ee21-cert\") pod \"llmisvc-controller-manager-85ccfc4685-l5fj5\" (UID: \"27f39f5f-b64b-482b-b684-9c573028ee21\") " pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:49.233338 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.233191 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc0414db-20b9-4b50-b5cb-b914c3dd9086-cert\") pod \"kserve-controller-manager-84b6647887-j9pnr\" (UID: \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\") " pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:49.233451 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.233429 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/dfb56860-3914-407b-874f-fced08269626-data\") pod \"seaweedfs-86cc847c5c-fx2vc\" (UID: \"dfb56860-3914-407b-874f-fced08269626\") " pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:49.235755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.235729 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc0414db-20b9-4b50-b5cb-b914c3dd9086-cert\") pod \"kserve-controller-manager-84b6647887-j9pnr\" (UID: \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\") " pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:49.235755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.235736 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27f39f5f-b64b-482b-b684-9c573028ee21-cert\") pod \"llmisvc-controller-manager-85ccfc4685-l5fj5\" (UID: \"27f39f5f-b64b-482b-b684-9c573028ee21\") " pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:49.242455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.242409 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7vmb\" (UniqueName: \"kubernetes.io/projected/27f39f5f-b64b-482b-b684-9c573028ee21-kube-api-access-h7vmb\") pod \"llmisvc-controller-manager-85ccfc4685-l5fj5\" (UID: \"27f39f5f-b64b-482b-b684-9c573028ee21\") " pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:49.244134 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.244107 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvjsf\" (UniqueName: \"kubernetes.io/projected/dfb56860-3914-407b-874f-fced08269626-kube-api-access-vvjsf\") pod \"seaweedfs-86cc847c5c-fx2vc\" (UID: \"dfb56860-3914-407b-874f-fced08269626\") " pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:49.244552 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.244530 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-277gz\" (UniqueName: \"kubernetes.io/projected/bc0414db-20b9-4b50-b5cb-b914c3dd9086-kube-api-access-277gz\") pod \"kserve-controller-manager-84b6647887-j9pnr\" (UID: \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\") " pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:49.298092 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.298055 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:49.323355 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.323325 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:49.352938 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.352905 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:49.466749 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.466704 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-j9pnr"] Apr 24 21:35:49.468550 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:35:49.468523 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc0414db_20b9_4b50_b5cb_b914c3dd9086.slice/crio-14811a3430f36c9186d78769dea3ff8a8aa21ed39faab7ec0072b59f026d99ff WatchSource:0}: Error finding container 14811a3430f36c9186d78769dea3ff8a8aa21ed39faab7ec0072b59f026d99ff: Status 404 returned error can't find the container with id 14811a3430f36c9186d78769dea3ff8a8aa21ed39faab7ec0072b59f026d99ff Apr 24 21:35:49.487093 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.487060 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-85ccfc4685-l5fj5"] Apr 24 21:35:49.488730 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:35:49.488694 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod27f39f5f_b64b_482b_b684_9c573028ee21.slice/crio-5893b4bbbee9c8cdb88bd2a57d371df71d079a17daf490bac92c114a334e0c12 WatchSource:0}: Error finding container 5893b4bbbee9c8cdb88bd2a57d371df71d079a17daf490bac92c114a334e0c12: Status 404 returned error can't find the container with id 5893b4bbbee9c8cdb88bd2a57d371df71d079a17daf490bac92c114a334e0c12 Apr 24 21:35:49.528481 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:49.528454 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-fx2vc"] Apr 24 21:35:49.529202 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:35:49.529177 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfb56860_3914_407b_874f_fced08269626.slice/crio-f6f867ba570fe73b647670254c6700ae708c3c90d31336664cef35ec6e68244f WatchSource:0}: Error finding container f6f867ba570fe73b647670254c6700ae708c3c90d31336664cef35ec6e68244f: Status 404 returned error can't find the container with id f6f867ba570fe73b647670254c6700ae708c3c90d31336664cef35ec6e68244f Apr 24 21:35:50.348353 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:50.348320 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-fx2vc" event={"ID":"dfb56860-3914-407b-874f-fced08269626","Type":"ContainerStarted","Data":"f6f867ba570fe73b647670254c6700ae708c3c90d31336664cef35ec6e68244f"} Apr 24 21:35:50.351904 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:50.351868 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" event={"ID":"27f39f5f-b64b-482b-b684-9c573028ee21","Type":"ContainerStarted","Data":"5893b4bbbee9c8cdb88bd2a57d371df71d079a17daf490bac92c114a334e0c12"} Apr 24 21:35:50.353422 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:50.353393 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" 
event={"ID":"bc0414db-20b9-4b50-b5cb-b914c3dd9086","Type":"ContainerStarted","Data":"14811a3430f36c9186d78769dea3ff8a8aa21ed39faab7ec0072b59f026d99ff"} Apr 24 21:35:54.372165 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.372104 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" event={"ID":"bc0414db-20b9-4b50-b5cb-b914c3dd9086","Type":"ContainerStarted","Data":"85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3"} Apr 24 21:35:54.372165 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.372166 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:35:54.373623 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.373592 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-fx2vc" event={"ID":"dfb56860-3914-407b-874f-fced08269626","Type":"ContainerStarted","Data":"a25054e7ffdb8913adc56c9fa40972d49f17e32b5c070c5eb80d93c5a6da0897"} Apr 24 21:35:54.373789 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.373714 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:35:54.374925 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.374904 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" event={"ID":"27f39f5f-b64b-482b-b684-9c573028ee21","Type":"ContainerStarted","Data":"4bd950b83cb68c3dd99ce0b96f987e1aeac9d90b7aeb4d24b5d6aa9302eae8b1"} Apr 24 21:35:54.375050 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.375022 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:35:54.390164 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.390118 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-84b6647887-j9pnr" podStartSLOduration=1.955311344 podStartE2EDuration="6.390106s" podCreationTimestamp="2026-04-24 21:35:48 +0000 UTC" firstStartedPulling="2026-04-24 21:35:49.470330296 +0000 UTC m=+560.710683536" lastFinishedPulling="2026-04-24 21:35:53.905124951 +0000 UTC m=+565.145478192" observedRunningTime="2026-04-24 21:35:54.387967123 +0000 UTC m=+565.628320384" watchObservedRunningTime="2026-04-24 21:35:54.390106 +0000 UTC m=+565.630459261" Apr 24 21:35:54.404335 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.404293 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-fx2vc" podStartSLOduration=0.779167414 podStartE2EDuration="5.404279519s" podCreationTimestamp="2026-04-24 21:35:49 +0000 UTC" firstStartedPulling="2026-04-24 21:35:49.530454718 +0000 UTC m=+560.770807957" lastFinishedPulling="2026-04-24 21:35:54.15556682 +0000 UTC m=+565.395920062" observedRunningTime="2026-04-24 21:35:54.403379605 +0000 UTC m=+565.643732866" watchObservedRunningTime="2026-04-24 21:35:54.404279519 +0000 UTC m=+565.644632779" Apr 24 21:35:54.422694 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:35:54.422557 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" podStartSLOduration=1.821580186 podStartE2EDuration="6.422538788s" podCreationTimestamp="2026-04-24 21:35:48 +0000 UTC" firstStartedPulling="2026-04-24 21:35:49.490347153 +0000 UTC m=+560.730700393" lastFinishedPulling="2026-04-24 21:35:54.091305738 +0000 UTC m=+565.331658995" observedRunningTime="2026-04-24 21:35:54.420357502 +0000 UTC m=+565.660710760" watchObservedRunningTime="2026-04-24 21:35:54.422538788 +0000 UTC m=+565.662892049" Apr 24 21:36:00.381880 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:00.381845 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-fx2vc" Apr 24 21:36:25.381837 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:25.381804 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 21:36:25.384890 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:25.384870 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:36:26.882367 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:26.882333 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-j9pnr"] Apr 24 21:36:26.882806 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:26.882532 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" podUID="bc0414db-20b9-4b50-b5cb-b914c3dd9086" containerName="manager" containerID="cri-o://85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3" gracePeriod=10 Apr 24 21:36:26.899930 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:26.899906 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84b6647887-hqblb"] Apr 24 21:36:26.903630 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:26.903613 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:26.909043 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:26.909005 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-hqblb"] Apr 24 21:36:26.947437 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:26.947402 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72jt4\" (UniqueName: \"kubernetes.io/projected/db338508-f5cc-4372-a3b9-52b695a5cea7-kube-api-access-72jt4\") pod \"kserve-controller-manager-84b6647887-hqblb\" (UID: \"db338508-f5cc-4372-a3b9-52b695a5cea7\") " pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:26.947588 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:26.947473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db338508-f5cc-4372-a3b9-52b695a5cea7-cert\") pod \"kserve-controller-manager-84b6647887-hqblb\" (UID: \"db338508-f5cc-4372-a3b9-52b695a5cea7\") " pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:27.048165 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.048131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db338508-f5cc-4372-a3b9-52b695a5cea7-cert\") pod \"kserve-controller-manager-84b6647887-hqblb\" (UID: \"db338508-f5cc-4372-a3b9-52b695a5cea7\") " pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:27.048327 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.048180 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72jt4\" (UniqueName: \"kubernetes.io/projected/db338508-f5cc-4372-a3b9-52b695a5cea7-kube-api-access-72jt4\") pod \"kserve-controller-manager-84b6647887-hqblb\" (UID: \"db338508-f5cc-4372-a3b9-52b695a5cea7\") " 
pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:27.051009 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.050921 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/db338508-f5cc-4372-a3b9-52b695a5cea7-cert\") pod \"kserve-controller-manager-84b6647887-hqblb\" (UID: \"db338508-f5cc-4372-a3b9-52b695a5cea7\") " pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:27.058760 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.058730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jt4\" (UniqueName: \"kubernetes.io/projected/db338508-f5cc-4372-a3b9-52b695a5cea7-kube-api-access-72jt4\") pod \"kserve-controller-manager-84b6647887-hqblb\" (UID: \"db338508-f5cc-4372-a3b9-52b695a5cea7\") " pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:27.124145 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.124117 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:36:27.249223 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.249193 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc0414db-20b9-4b50-b5cb-b914c3dd9086-cert\") pod \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\" (UID: \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\") " Apr 24 21:36:27.249362 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.249238 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-277gz\" (UniqueName: \"kubernetes.io/projected/bc0414db-20b9-4b50-b5cb-b914c3dd9086-kube-api-access-277gz\") pod \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\" (UID: \"bc0414db-20b9-4b50-b5cb-b914c3dd9086\") " Apr 24 21:36:27.251545 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.251510 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0414db-20b9-4b50-b5cb-b914c3dd9086-cert" (OuterVolumeSpecName: "cert") pod "bc0414db-20b9-4b50-b5cb-b914c3dd9086" (UID: "bc0414db-20b9-4b50-b5cb-b914c3dd9086"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:36:27.251545 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.251535 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0414db-20b9-4b50-b5cb-b914c3dd9086-kube-api-access-277gz" (OuterVolumeSpecName: "kube-api-access-277gz") pod "bc0414db-20b9-4b50-b5cb-b914c3dd9086" (UID: "bc0414db-20b9-4b50-b5cb-b914c3dd9086"). InnerVolumeSpecName "kube-api-access-277gz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:27.259374 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.259354 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:27.350260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.350234 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc0414db-20b9-4b50-b5cb-b914c3dd9086-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:36:27.350260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.350260 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-277gz\" (UniqueName: \"kubernetes.io/projected/bc0414db-20b9-4b50-b5cb-b914c3dd9086-kube-api-access-277gz\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:36:27.385618 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.385587 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-hqblb"] Apr 24 21:36:27.387061 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:36:27.387028 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb338508_f5cc_4372_a3b9_52b695a5cea7.slice/crio-52e4776d555cdf5c7274b89f367e752c662cf1e33766ddab4b8e45aae8758cad WatchSource:0}: Error finding container 52e4776d555cdf5c7274b89f367e752c662cf1e33766ddab4b8e45aae8758cad: Status 404 returned error can't find the container with id 52e4776d555cdf5c7274b89f367e752c662cf1e33766ddab4b8e45aae8758cad Apr 24 21:36:27.500157 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.500067 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-hqblb" event={"ID":"db338508-f5cc-4372-a3b9-52b695a5cea7","Type":"ContainerStarted","Data":"52e4776d555cdf5c7274b89f367e752c662cf1e33766ddab4b8e45aae8758cad"} Apr 24 21:36:27.501101 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.501076 2569 generic.go:358] "Generic (PLEG): container finished" podID="bc0414db-20b9-4b50-b5cb-b914c3dd9086" 
containerID="85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3" exitCode=0 Apr 24 21:36:27.501216 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.501108 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" event={"ID":"bc0414db-20b9-4b50-b5cb-b914c3dd9086","Type":"ContainerDied","Data":"85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3"} Apr 24 21:36:27.501216 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.501138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" event={"ID":"bc0414db-20b9-4b50-b5cb-b914c3dd9086","Type":"ContainerDied","Data":"14811a3430f36c9186d78769dea3ff8a8aa21ed39faab7ec0072b59f026d99ff"} Apr 24 21:36:27.501216 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.501144 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-j9pnr" Apr 24 21:36:27.501216 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.501156 2569 scope.go:117] "RemoveContainer" containerID="85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3" Apr 24 21:36:27.509789 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.509766 2569 scope.go:117] "RemoveContainer" containerID="85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3" Apr 24 21:36:27.510047 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:36:27.510029 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3\": container with ID starting with 85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3 not found: ID does not exist" containerID="85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3" Apr 24 21:36:27.510093 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.510056 2569 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3"} err="failed to get container status \"85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3\": rpc error: code = NotFound desc = could not find container \"85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3\": container with ID starting with 85030c426ea73d5271ad3ef7511b9566a0e723bb63efc4a380126f7e2fa8f3f3 not found: ID does not exist" Apr 24 21:36:27.519289 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.519258 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-j9pnr"] Apr 24 21:36:27.523778 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:27.523738 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-j9pnr"] Apr 24 21:36:28.506290 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:28.506257 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-hqblb" event={"ID":"db338508-f5cc-4372-a3b9-52b695a5cea7","Type":"ContainerStarted","Data":"bed01312c134652c5466451d7de595c8d5d74fbd68e0da22d453f02413f1755a"} Apr 24 21:36:28.506697 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:28.506399 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:36:28.527212 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:28.527159 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84b6647887-hqblb" podStartSLOduration=2.130864764 podStartE2EDuration="2.527144065s" podCreationTimestamp="2026-04-24 21:36:26 +0000 UTC" firstStartedPulling="2026-04-24 21:36:27.388309841 +0000 UTC m=+598.628663080" lastFinishedPulling="2026-04-24 21:36:27.784589142 +0000 UTC m=+599.024942381" observedRunningTime="2026-04-24 21:36:28.525687391 +0000 UTC m=+599.766040644" 
watchObservedRunningTime="2026-04-24 21:36:28.527144065 +0000 UTC m=+599.767497325" Apr 24 21:36:29.314242 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:29.314212 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:36:29.314595 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:29.314574 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:36:29.332968 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:29.332935 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0414db-20b9-4b50-b5cb-b914c3dd9086" path="/var/lib/kubelet/pods/bc0414db-20b9-4b50-b5cb-b914c3dd9086/volumes" Apr 24 21:36:59.515198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:36:59.515113 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84b6647887-hqblb" Apr 24 21:37:00.392290 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.392247 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-8bb7t"] Apr 24 21:37:00.392794 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.392779 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc0414db-20b9-4b50-b5cb-b914c3dd9086" containerName="manager" Apr 24 21:37:00.392842 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.392798 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0414db-20b9-4b50-b5cb-b914c3dd9086" containerName="manager" Apr 24 21:37:00.392900 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.392889 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc0414db-20b9-4b50-b5cb-b914c3dd9086" containerName="manager" Apr 24 21:37:00.396562 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.396536 2569 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:00.399292 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.399266 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-lv54z\"" Apr 24 21:37:00.400646 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.400625 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 21:37:00.403501 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.403471 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-q9mzt"] Apr 24 21:37:00.409465 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.409401 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:00.417941 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.411016 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-8bb7t"] Apr 24 21:37:00.417941 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.416038 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 21:37:00.417941 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.416427 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-4dtpj\"" Apr 24 21:37:00.423764 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.423718 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-q9mzt"] Apr 24 21:37:00.439583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.439535 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/634e9e8a-cb8f-4839-b5a5-8704e670513f-tls-certs\") pod 
\"model-serving-api-86f7b4b499-8bb7t\" (UID: \"634e9e8a-cb8f-4839-b5a5-8704e670513f\") " pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:00.439783 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.439627 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ltqc\" (UniqueName: \"kubernetes.io/projected/634e9e8a-cb8f-4839-b5a5-8704e670513f-kube-api-access-6ltqc\") pod \"model-serving-api-86f7b4b499-8bb7t\" (UID: \"634e9e8a-cb8f-4839-b5a5-8704e670513f\") " pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:00.439783 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.439659 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fncjm\" (UniqueName: \"kubernetes.io/projected/c41fe313-d889-48bf-b3a0-148f1175b7e0-kube-api-access-fncjm\") pod \"odh-model-controller-696fc77849-q9mzt\" (UID: \"c41fe313-d889-48bf-b3a0-148f1175b7e0\") " pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:00.439783 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.439727 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c41fe313-d889-48bf-b3a0-148f1175b7e0-cert\") pod \"odh-model-controller-696fc77849-q9mzt\" (UID: \"c41fe313-d889-48bf-b3a0-148f1175b7e0\") " pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:00.540917 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.540871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/634e9e8a-cb8f-4839-b5a5-8704e670513f-tls-certs\") pod \"model-serving-api-86f7b4b499-8bb7t\" (UID: \"634e9e8a-cb8f-4839-b5a5-8704e670513f\") " pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:00.541311 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.540970 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6ltqc\" (UniqueName: \"kubernetes.io/projected/634e9e8a-cb8f-4839-b5a5-8704e670513f-kube-api-access-6ltqc\") pod \"model-serving-api-86f7b4b499-8bb7t\" (UID: \"634e9e8a-cb8f-4839-b5a5-8704e670513f\") " pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:00.541311 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.541000 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fncjm\" (UniqueName: \"kubernetes.io/projected/c41fe313-d889-48bf-b3a0-148f1175b7e0-kube-api-access-fncjm\") pod \"odh-model-controller-696fc77849-q9mzt\" (UID: \"c41fe313-d889-48bf-b3a0-148f1175b7e0\") " pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:00.541311 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.541040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c41fe313-d889-48bf-b3a0-148f1175b7e0-cert\") pod \"odh-model-controller-696fc77849-q9mzt\" (UID: \"c41fe313-d889-48bf-b3a0-148f1175b7e0\") " pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:00.541311 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:37:00.541179 2569 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 21:37:00.541311 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:37:00.541265 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c41fe313-d889-48bf-b3a0-148f1175b7e0-cert podName:c41fe313-d889-48bf-b3a0-148f1175b7e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:01.041240408 +0000 UTC m=+632.281593649 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c41fe313-d889-48bf-b3a0-148f1175b7e0-cert") pod "odh-model-controller-696fc77849-q9mzt" (UID: "c41fe313-d889-48bf-b3a0-148f1175b7e0") : secret "odh-model-controller-webhook-cert" not found Apr 24 21:37:00.543657 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.543633 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/634e9e8a-cb8f-4839-b5a5-8704e670513f-tls-certs\") pod \"model-serving-api-86f7b4b499-8bb7t\" (UID: \"634e9e8a-cb8f-4839-b5a5-8704e670513f\") " pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:00.550912 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.550882 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ltqc\" (UniqueName: \"kubernetes.io/projected/634e9e8a-cb8f-4839-b5a5-8704e670513f-kube-api-access-6ltqc\") pod \"model-serving-api-86f7b4b499-8bb7t\" (UID: \"634e9e8a-cb8f-4839-b5a5-8704e670513f\") " pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:00.551062 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.550888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fncjm\" (UniqueName: \"kubernetes.io/projected/c41fe313-d889-48bf-b3a0-148f1175b7e0-kube-api-access-fncjm\") pod \"odh-model-controller-696fc77849-q9mzt\" (UID: \"c41fe313-d889-48bf-b3a0-148f1175b7e0\") " pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:00.715798 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.715696 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:00.852587 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:00.852559 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-8bb7t"] Apr 24 21:37:00.854236 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:37:00.854208 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod634e9e8a_cb8f_4839_b5a5_8704e670513f.slice/crio-f02a3b4665d3a67c43a6ad55a4a434fe9b22b107808a44a67ec900c5f42ebdc1 WatchSource:0}: Error finding container f02a3b4665d3a67c43a6ad55a4a434fe9b22b107808a44a67ec900c5f42ebdc1: Status 404 returned error can't find the container with id f02a3b4665d3a67c43a6ad55a4a434fe9b22b107808a44a67ec900c5f42ebdc1 Apr 24 21:37:01.045776 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:01.045730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c41fe313-d889-48bf-b3a0-148f1175b7e0-cert\") pod \"odh-model-controller-696fc77849-q9mzt\" (UID: \"c41fe313-d889-48bf-b3a0-148f1175b7e0\") " pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:01.048335 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:01.048313 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c41fe313-d889-48bf-b3a0-148f1175b7e0-cert\") pod \"odh-model-controller-696fc77849-q9mzt\" (UID: \"c41fe313-d889-48bf-b3a0-148f1175b7e0\") " pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:01.336757 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:01.336655 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:01.494319 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:01.494289 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-q9mzt"] Apr 24 21:37:01.603425 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:37:01.603344 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41fe313_d889_48bf_b3a0_148f1175b7e0.slice/crio-a5d23f38a1ca5c2a3846c660477dd3aefd7c2aabddda01c23dce81b97eee9a34 WatchSource:0}: Error finding container a5d23f38a1ca5c2a3846c660477dd3aefd7c2aabddda01c23dce81b97eee9a34: Status 404 returned error can't find the container with id a5d23f38a1ca5c2a3846c660477dd3aefd7c2aabddda01c23dce81b97eee9a34 Apr 24 21:37:01.635504 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:01.635456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-8bb7t" event={"ID":"634e9e8a-cb8f-4839-b5a5-8704e670513f","Type":"ContainerStarted","Data":"f02a3b4665d3a67c43a6ad55a4a434fe9b22b107808a44a67ec900c5f42ebdc1"} Apr 24 21:37:01.636840 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:01.636809 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-q9mzt" event={"ID":"c41fe313-d889-48bf-b3a0-148f1175b7e0","Type":"ContainerStarted","Data":"a5d23f38a1ca5c2a3846c660477dd3aefd7c2aabddda01c23dce81b97eee9a34"} Apr 24 21:37:02.643752 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:02.643711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-8bb7t" event={"ID":"634e9e8a-cb8f-4839-b5a5-8704e670513f","Type":"ContainerStarted","Data":"bb3848c5d968d41fc61623506615bb7ce381d9127a992686aecc509a2c0fd979"} Apr 24 21:37:02.644198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:02.643786 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:02.664123 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:02.664058 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-8bb7t" podStartSLOduration=1.4213161109999999 podStartE2EDuration="2.664039452s" podCreationTimestamp="2026-04-24 21:37:00 +0000 UTC" firstStartedPulling="2026-04-24 21:37:00.856056291 +0000 UTC m=+632.096409529" lastFinishedPulling="2026-04-24 21:37:02.098779615 +0000 UTC m=+633.339132870" observedRunningTime="2026-04-24 21:37:02.662764794 +0000 UTC m=+633.903118056" watchObservedRunningTime="2026-04-24 21:37:02.664039452 +0000 UTC m=+633.904392717" Apr 24 21:37:04.659383 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:04.659265 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-q9mzt" event={"ID":"c41fe313-d889-48bf-b3a0-148f1175b7e0","Type":"ContainerStarted","Data":"99e3e95d6a726d1b90b2f4190f761b76f3df4858cd2d41068e659e3bc0f41445"} Apr 24 21:37:04.659921 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:04.659418 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:04.676800 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:04.676741 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-q9mzt" podStartSLOduration=1.925850385 podStartE2EDuration="4.676722674s" podCreationTimestamp="2026-04-24 21:37:00 +0000 UTC" firstStartedPulling="2026-04-24 21:37:01.60536123 +0000 UTC m=+632.845714474" lastFinishedPulling="2026-04-24 21:37:04.356233517 +0000 UTC m=+635.596586763" observedRunningTime="2026-04-24 21:37:04.67524071 +0000 UTC m=+635.915593986" watchObservedRunningTime="2026-04-24 21:37:04.676722674 +0000 UTC m=+635.917075934" Apr 24 21:37:13.652791 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:13.652761 2569 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-8bb7t" Apr 24 21:37:15.666028 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:15.666000 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-q9mzt" Apr 24 21:37:16.480294 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.480256 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-4zmd8"] Apr 24 21:37:16.483927 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.483908 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-4zmd8" Apr 24 21:37:16.492025 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.491998 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-4zmd8"] Apr 24 21:37:16.576602 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.576566 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnmtn\" (UniqueName: \"kubernetes.io/projected/0eea4ff2-f92f-4c59-8632-2fc7cb0ba868-kube-api-access-xnmtn\") pod \"s3-init-4zmd8\" (UID: \"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868\") " pod="kserve/s3-init-4zmd8" Apr 24 21:37:16.677353 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.677316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnmtn\" (UniqueName: \"kubernetes.io/projected/0eea4ff2-f92f-4c59-8632-2fc7cb0ba868-kube-api-access-xnmtn\") pod \"s3-init-4zmd8\" (UID: \"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868\") " pod="kserve/s3-init-4zmd8" Apr 24 21:37:16.686647 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.686615 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnmtn\" (UniqueName: \"kubernetes.io/projected/0eea4ff2-f92f-4c59-8632-2fc7cb0ba868-kube-api-access-xnmtn\") pod \"s3-init-4zmd8\" (UID: \"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868\") " pod="kserve/s3-init-4zmd8" Apr 24 
21:37:16.793987 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.793962 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-4zmd8" Apr 24 21:37:16.922291 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.922264 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-4zmd8"] Apr 24 21:37:16.923641 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:37:16.923618 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eea4ff2_f92f_4c59_8632_2fc7cb0ba868.slice/crio-f177f36044f8f4c2ab21edbed276a0044bfdbaef0d312a49095a5e702e74e003 WatchSource:0}: Error finding container f177f36044f8f4c2ab21edbed276a0044bfdbaef0d312a49095a5e702e74e003: Status 404 returned error can't find the container with id f177f36044f8f4c2ab21edbed276a0044bfdbaef0d312a49095a5e702e74e003 Apr 24 21:37:16.925582 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:16.925562 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:37:17.711837 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:17.711791 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-4zmd8" event={"ID":"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868","Type":"ContainerStarted","Data":"f177f36044f8f4c2ab21edbed276a0044bfdbaef0d312a49095a5e702e74e003"} Apr 24 21:37:21.729296 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:21.729193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-4zmd8" event={"ID":"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868","Type":"ContainerStarted","Data":"c5a386287eb0972860b44cc7124f24c2326d613601c2a209512c834f7db231c9"} Apr 24 21:37:21.747877 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:21.747827 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-4zmd8" podStartSLOduration=1.29178676 podStartE2EDuration="5.747813895s" podCreationTimestamp="2026-04-24 21:37:16 
+0000 UTC" firstStartedPulling="2026-04-24 21:37:16.925782847 +0000 UTC m=+648.166136091" lastFinishedPulling="2026-04-24 21:37:21.381809973 +0000 UTC m=+652.622163226" observedRunningTime="2026-04-24 21:37:21.745753376 +0000 UTC m=+652.986106638" watchObservedRunningTime="2026-04-24 21:37:21.747813895 +0000 UTC m=+652.988167155" Apr 24 21:37:24.742803 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:24.742763 2569 generic.go:358] "Generic (PLEG): container finished" podID="0eea4ff2-f92f-4c59-8632-2fc7cb0ba868" containerID="c5a386287eb0972860b44cc7124f24c2326d613601c2a209512c834f7db231c9" exitCode=0 Apr 24 21:37:24.743159 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:24.742838 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-4zmd8" event={"ID":"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868","Type":"ContainerDied","Data":"c5a386287eb0972860b44cc7124f24c2326d613601c2a209512c834f7db231c9"} Apr 24 21:37:25.875208 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:25.875182 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-4zmd8" Apr 24 21:37:25.961485 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:25.961444 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnmtn\" (UniqueName: \"kubernetes.io/projected/0eea4ff2-f92f-4c59-8632-2fc7cb0ba868-kube-api-access-xnmtn\") pod \"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868\" (UID: \"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868\") " Apr 24 21:37:25.963714 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:25.963691 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eea4ff2-f92f-4c59-8632-2fc7cb0ba868-kube-api-access-xnmtn" (OuterVolumeSpecName: "kube-api-access-xnmtn") pod "0eea4ff2-f92f-4c59-8632-2fc7cb0ba868" (UID: "0eea4ff2-f92f-4c59-8632-2fc7cb0ba868"). InnerVolumeSpecName "kube-api-access-xnmtn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:26.062973 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:26.062935 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnmtn\" (UniqueName: \"kubernetes.io/projected/0eea4ff2-f92f-4c59-8632-2fc7cb0ba868-kube-api-access-xnmtn\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:37:26.752164 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:26.752126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-4zmd8" event={"ID":"0eea4ff2-f92f-4c59-8632-2fc7cb0ba868","Type":"ContainerDied","Data":"f177f36044f8f4c2ab21edbed276a0044bfdbaef0d312a49095a5e702e74e003"} Apr 24 21:37:26.752164 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:26.752150 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-4zmd8" Apr 24 21:37:26.752164 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:26.752167 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f177f36044f8f4c2ab21edbed276a0044bfdbaef0d312a49095a5e702e74e003" Apr 24 21:37:37.052290 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.052248 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx"] Apr 24 21:37:37.052771 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.052720 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0eea4ff2-f92f-4c59-8632-2fc7cb0ba868" containerName="s3-init" Apr 24 21:37:37.052771 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.052736 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eea4ff2-f92f-4c59-8632-2fc7cb0ba868" containerName="s3-init" Apr 24 21:37:37.052856 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.052799 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0eea4ff2-f92f-4c59-8632-2fc7cb0ba868" containerName="s3-init" Apr 24 21:37:37.059422 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.059395 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.062663 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.062638 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 24 21:37:37.062827 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.062753 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:37:37.063107 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.063087 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-mgqsb\"" Apr 24 21:37:37.063195 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.063096 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:37:37.076000 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.075973 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx"] Apr 24 21:37:37.160863 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.160822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.161063 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.160870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.161063 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.160914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.161063 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.160969 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7321a418-5367-453b-83e2-4f814b7bfcb0-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.161063 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.160995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.161063 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.161013 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-data\") 
pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.161063 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.161057 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.161286 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.161078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.161286 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.161110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzwxn\" (UniqueName: \"kubernetes.io/projected/7321a418-5367-453b-83e2-4f814b7bfcb0-kube-api-access-tzwxn\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.261907 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.261863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzwxn\" (UniqueName: \"kubernetes.io/projected/7321a418-5367-453b-83e2-4f814b7bfcb0-kube-api-access-tzwxn\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: 
\"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262114 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.261954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262114 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.261991 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262114 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262019 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262114 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262072 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7321a418-5367-453b-83e2-4f814b7bfcb0-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 
21:37:37.262114 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262379 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262141 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262379 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262379 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262736 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262708 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" 
(UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.262892 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262782 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.263024 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.262837 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.263345 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.263319 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.263467 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.263405 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7321a418-5367-453b-83e2-4f814b7bfcb0-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: 
\"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.264970 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.264951 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.265216 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.265194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.278590 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.278559 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzwxn\" (UniqueName: \"kubernetes.io/projected/7321a418-5367-453b-83e2-4f814b7bfcb0-kube-api-access-tzwxn\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.285525 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.285495 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7321a418-5367-453b-83e2-4f814b7bfcb0-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-b9jxx\" (UID: \"7321a418-5367-453b-83e2-4f814b7bfcb0\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.370276 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:37:37.370166 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:37.529248 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.529213 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx"] Apr 24 21:37:37.529573 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:37:37.529547 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7321a418_5367_453b_83e2_4f814b7bfcb0.slice/crio-279c07528f79399696730b5d0848fc00a1c454ed9a5574f102e40868d30cb44b WatchSource:0}: Error finding container 279c07528f79399696730b5d0848fc00a1c454ed9a5574f102e40868d30cb44b: Status 404 returned error can't find the container with id 279c07528f79399696730b5d0848fc00a1c454ed9a5574f102e40868d30cb44b Apr 24 21:37:37.531867 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.531833 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 24 21:37:37.531971 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.531913 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 24 21:37:37.531971 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.531958 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 24 21:37:37.793750 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.793709 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" 
event={"ID":"7321a418-5367-453b-83e2-4f814b7bfcb0","Type":"ContainerStarted","Data":"50dd5ef53ac4e0ac65683edb0777e80c1ef5356ed693c93261857e88d5dd014e"} Apr 24 21:37:37.793750 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.793755 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" event={"ID":"7321a418-5367-453b-83e2-4f814b7bfcb0","Type":"ContainerStarted","Data":"279c07528f79399696730b5d0848fc00a1c454ed9a5574f102e40868d30cb44b"} Apr 24 21:37:37.824074 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:37.824017 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" podStartSLOduration=0.824001863 podStartE2EDuration="824.001863ms" podCreationTimestamp="2026-04-24 21:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:37:37.822285641 +0000 UTC m=+669.062638903" watchObservedRunningTime="2026-04-24 21:37:37.824001863 +0000 UTC m=+669.064355137" Apr 24 21:37:38.370545 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:38.370512 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:38.376019 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:38.375988 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:38.798622 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:38.798583 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:38.799623 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:38.799601 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-b9jxx" Apr 24 21:37:41.681546 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.681506 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz"] Apr 24 21:37:41.685252 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.685218 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.689142 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.689118 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec0c69dceeb48768325d1a53a749e65786-kserve-self-signed-certs\"" Apr 24 21:37:41.689283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.689171 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wjq2p\"" Apr 24 21:37:41.698133 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.698108 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz"] Apr 24 21:37:41.805314 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.805265 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.805543 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.805338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-home\") pod 
\"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.805543 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.805382 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.805543 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.805409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.805543 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.805459 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.805543 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.805528 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e226eb9b-5238-4731-8c52-176a51777809-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.805838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.805580 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4ft\" (UniqueName: \"kubernetes.io/projected/e226eb9b-5238-4731-8c52-176a51777809-kube-api-access-2k4ft\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906212 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906409 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906225 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906409 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906409 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906409 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906301 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e226eb9b-5238-4731-8c52-176a51777809-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906629 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906436 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k4ft\" (UniqueName: \"kubernetes.io/projected/e226eb9b-5238-4731-8c52-176a51777809-kube-api-access-2k4ft\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906629 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906501 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906772 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906814 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906793 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.906948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.906932 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-tmp-dir\") pod 
\"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.908771 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.908748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.909073 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.909053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e226eb9b-5238-4731-8c52-176a51777809-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.929410 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.929377 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k4ft\" (UniqueName: \"kubernetes.io/projected/e226eb9b-5238-4731-8c52-176a51777809-kube-api-access-2k4ft\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:41.998202 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:41.998167 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:37:42.143345 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:42.143310 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz"] Apr 24 21:37:42.146889 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:37:42.146856 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode226eb9b_5238_4731_8c52_176a51777809.slice/crio-b7ae54ddd1077d832b4c2a79775cde10ac4ddb372b8301c5c5799e87949d5c67 WatchSource:0}: Error finding container b7ae54ddd1077d832b4c2a79775cde10ac4ddb372b8301c5c5799e87949d5c67: Status 404 returned error can't find the container with id b7ae54ddd1077d832b4c2a79775cde10ac4ddb372b8301c5c5799e87949d5c67 Apr 24 21:37:42.815930 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:42.815888 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" event={"ID":"e226eb9b-5238-4731-8c52-176a51777809","Type":"ContainerStarted","Data":"b7ae54ddd1077d832b4c2a79775cde10ac4ddb372b8301c5c5799e87949d5c67"} Apr 24 21:37:45.864369 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:45.864317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" event={"ID":"e226eb9b-5238-4731-8c52-176a51777809","Type":"ContainerStarted","Data":"fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154"} Apr 24 21:37:49.761731 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.761697 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297"] Apr 24 21:37:49.767276 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.767249 2569 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.773249 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.773221 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 24 21:37:49.773392 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.773221 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-gql2z\"" Apr 24 21:37:49.783819 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.783788 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297"] Apr 24 21:37:49.802888 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.802853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.803033 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.802904 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.803033 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.802982 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.803033 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.803029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.803219 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.803056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e76fa608-77f5-48a2-8029-94db346c054a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.803219 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.803082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rw9w\" (UniqueName: \"kubernetes.io/projected/e76fa608-77f5-48a2-8029-94db346c054a-kube-api-access-8rw9w\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904091 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904053 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904279 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904279 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904137 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e76fa608-77f5-48a2-8029-94db346c054a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904279 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rw9w\" (UniqueName: \"kubernetes.io/projected/e76fa608-77f5-48a2-8029-94db346c054a-kube-api-access-8rw9w\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904279 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904220 2569 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904279 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904606 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904544 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904606 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904544 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904860 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904607 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.904860 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.904762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.907018 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.906998 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e76fa608-77f5-48a2-8029-94db346c054a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:49.918051 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:49.918020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rw9w\" (UniqueName: \"kubernetes.io/projected/e76fa608-77f5-48a2-8029-94db346c054a-kube-api-access-8rw9w\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:50.077199 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:50.077108 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:37:50.230196 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:50.230167 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297"] Apr 24 21:37:50.232213 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:37:50.232181 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76fa608_77f5_48a2_8029_94db346c054a.slice/crio-5e3af4251201edcf70c28d07c508f0838ebf2f919a45c1888d65c327e76124f2 WatchSource:0}: Error finding container 5e3af4251201edcf70c28d07c508f0838ebf2f919a45c1888d65c327e76124f2: Status 404 returned error can't find the container with id 5e3af4251201edcf70c28d07c508f0838ebf2f919a45c1888d65c327e76124f2 Apr 24 21:37:50.889861 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:50.889824 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" event={"ID":"e76fa608-77f5-48a2-8029-94db346c054a","Type":"ContainerStarted","Data":"91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1"} Apr 24 21:37:50.889861 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:50.889864 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" event={"ID":"e76fa608-77f5-48a2-8029-94db346c054a","Type":"ContainerStarted","Data":"5e3af4251201edcf70c28d07c508f0838ebf2f919a45c1888d65c327e76124f2"} Apr 24 21:37:51.379508 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:51.379470 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz"] Apr 24 21:37:51.379828 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:51.379793 2569 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" podUID="e226eb9b-5238-4731-8c52-176a51777809" containerName="storage-initializer" containerID="cri-o://fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154" gracePeriod=30 Apr 24 21:37:51.896550 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:51.896515 2569 generic.go:358] "Generic (PLEG): container finished" podID="e76fa608-77f5-48a2-8029-94db346c054a" containerID="91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1" exitCode=0 Apr 24 21:37:51.897009 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:51.896586 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" event={"ID":"e76fa608-77f5-48a2-8029-94db346c054a","Type":"ContainerDied","Data":"91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1"} Apr 24 21:37:53.908411 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:37:53.908370 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" event={"ID":"e76fa608-77f5-48a2-8029-94db346c054a","Type":"ContainerStarted","Data":"6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667"} Apr 24 21:38:23.438851 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.438823 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz_e226eb9b-5238-4731-8c52-176a51777809/storage-initializer/0.log" Apr 24 21:38:23.439269 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.438910 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:38:23.539787 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.539749 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-tmp-dir\") pod \"e226eb9b-5238-4731-8c52-176a51777809\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " Apr 24 21:38:23.539991 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.539861 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e226eb9b-5238-4731-8c52-176a51777809-tls-certs\") pod \"e226eb9b-5238-4731-8c52-176a51777809\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " Apr 24 21:38:23.539991 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.539906 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k4ft\" (UniqueName: \"kubernetes.io/projected/e226eb9b-5238-4731-8c52-176a51777809-kube-api-access-2k4ft\") pod \"e226eb9b-5238-4731-8c52-176a51777809\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " Apr 24 21:38:23.539991 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.539948 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-model-cache\") pod \"e226eb9b-5238-4731-8c52-176a51777809\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " Apr 24 21:38:23.539991 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.539971 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-dshm\") pod \"e226eb9b-5238-4731-8c52-176a51777809\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " Apr 24 21:38:23.540192 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:38:23.540032 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-home\") pod \"e226eb9b-5238-4731-8c52-176a51777809\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " Apr 24 21:38:23.540192 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.540066 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-kserve-provision-location\") pod \"e226eb9b-5238-4731-8c52-176a51777809\" (UID: \"e226eb9b-5238-4731-8c52-176a51777809\") " Apr 24 21:38:23.540192 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.540074 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "e226eb9b-5238-4731-8c52-176a51777809" (UID: "e226eb9b-5238-4731-8c52-176a51777809"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:23.540335 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.540251 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-model-cache" (OuterVolumeSpecName: "model-cache") pod "e226eb9b-5238-4731-8c52-176a51777809" (UID: "e226eb9b-5238-4731-8c52-176a51777809"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:23.540401 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.540360 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-home" (OuterVolumeSpecName: "home") pod "e226eb9b-5238-4731-8c52-176a51777809" (UID: "e226eb9b-5238-4731-8c52-176a51777809"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:23.540493 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.540471 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-model-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:38:23.540493 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.540493 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-home\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:38:23.540874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.540506 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-tmp-dir\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:38:23.542510 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.542480 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e226eb9b-5238-4731-8c52-176a51777809-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e226eb9b-5238-4731-8c52-176a51777809" (UID: "e226eb9b-5238-4731-8c52-176a51777809"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:38:23.542731 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.542703 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-dshm" (OuterVolumeSpecName: "dshm") pod "e226eb9b-5238-4731-8c52-176a51777809" (UID: "e226eb9b-5238-4731-8c52-176a51777809"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:23.542899 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.542878 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e226eb9b-5238-4731-8c52-176a51777809-kube-api-access-2k4ft" (OuterVolumeSpecName: "kube-api-access-2k4ft") pod "e226eb9b-5238-4731-8c52-176a51777809" (UID: "e226eb9b-5238-4731-8c52-176a51777809"). InnerVolumeSpecName "kube-api-access-2k4ft". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:38:23.589652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.589604 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e226eb9b-5238-4731-8c52-176a51777809" (UID: "e226eb9b-5238-4731-8c52-176a51777809"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:23.641491 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.641450 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e226eb9b-5238-4731-8c52-176a51777809-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:38:23.641491 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.641484 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2k4ft\" (UniqueName: \"kubernetes.io/projected/e226eb9b-5238-4731-8c52-176a51777809-kube-api-access-2k4ft\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:38:23.641491 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:23.641495 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-dshm\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:38:23.641754 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:38:23.641504 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e226eb9b-5238-4731-8c52-176a51777809-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:38:24.051421 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.051390 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz_e226eb9b-5238-4731-8c52-176a51777809/storage-initializer/0.log" Apr 24 21:38:24.051617 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.051439 2569 generic.go:358] "Generic (PLEG): container finished" podID="e226eb9b-5238-4731-8c52-176a51777809" containerID="fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154" exitCode=137 Apr 24 21:38:24.051617 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.051524 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" Apr 24 21:38:24.051617 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.051541 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" event={"ID":"e226eb9b-5238-4731-8c52-176a51777809","Type":"ContainerDied","Data":"fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154"} Apr 24 21:38:24.051617 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.051600 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz" event={"ID":"e226eb9b-5238-4731-8c52-176a51777809","Type":"ContainerDied","Data":"b7ae54ddd1077d832b4c2a79775cde10ac4ddb372b8301c5c5799e87949d5c67"} Apr 24 21:38:24.051887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.051621 2569 scope.go:117] "RemoveContainer" 
containerID="fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154" Apr 24 21:38:24.091267 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.091234 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz"] Apr 24 21:38:24.094997 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.094969 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-6b95c48b666jldz"] Apr 24 21:38:24.282879 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.282720 2569 scope.go:117] "RemoveContainer" containerID="fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154" Apr 24 21:38:24.283082 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:38:24.283063 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154\": container with ID starting with fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154 not found: ID does not exist" containerID="fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154" Apr 24 21:38:24.283126 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:24.283091 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154"} err="failed to get container status \"fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154\": rpc error: code = NotFound desc = could not find container \"fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154\": container with ID starting with fb9af68ffc497229f6fdb45ae0c2795a60f4467230db9674a1e5ed286f0a7154 not found: ID does not exist" Apr 24 21:38:25.058984 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.058939 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" event={"ID":"e76fa608-77f5-48a2-8029-94db346c054a","Type":"ContainerStarted","Data":"db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2"} Apr 24 21:38:25.059456 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.059182 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:38:25.062082 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.062059 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:38:25.084000 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.083939 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" podStartSLOduration=3.50700763 podStartE2EDuration="36.083921945s" podCreationTimestamp="2026-04-24 21:37:49 +0000 UTC" firstStartedPulling="2026-04-24 21:37:51.897860974 +0000 UTC m=+683.138214212" lastFinishedPulling="2026-04-24 21:38:24.474775276 +0000 UTC m=+715.715128527" observedRunningTime="2026-04-24 21:38:25.081408567 +0000 UTC m=+716.321761841" watchObservedRunningTime="2026-04-24 21:38:25.083921945 +0000 UTC m=+716.324275233" Apr 24 21:38:25.332523 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.332439 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e226eb9b-5238-4731-8c52-176a51777809" path="/var/lib/kubelet/pods/e226eb9b-5238-4731-8c52-176a51777809/volumes" Apr 24 21:38:25.932806 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.932766 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg"] Apr 24 21:38:25.933235 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.933217 2569 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e226eb9b-5238-4731-8c52-176a51777809" containerName="storage-initializer" Apr 24 21:38:25.933317 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.933238 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e226eb9b-5238-4731-8c52-176a51777809" containerName="storage-initializer" Apr 24 21:38:25.933371 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.933316 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e226eb9b-5238-4731-8c52-176a51777809" containerName="storage-initializer" Apr 24 21:38:25.990472 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.990423 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg"] Apr 24 21:38:25.990646 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.990563 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:25.993461 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:25.993435 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 24 21:38:26.164583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.164542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.165049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.164599 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.165049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.164635 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.165049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.164701 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.165049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.164804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhq2t\" (UniqueName: \"kubernetes.io/projected/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kube-api-access-xhq2t\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.165049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.164845 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.165049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.164920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.266406 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.266358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.266624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.266422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.266624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.266454 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.266624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.266505 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.266624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.266554 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhq2t\" (UniqueName: \"kubernetes.io/projected/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kube-api-access-xhq2t\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.266624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.266592 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.266624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.266621 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.266984 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.266858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.267088 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.267062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.267153 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.267086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.267206 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.267165 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-home\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.268981 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.268950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.269349 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.269320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.274984 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.274958 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhq2t\" (UniqueName: \"kubernetes.io/projected/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kube-api-access-xhq2t\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.301966 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.301924 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:38:26.451569 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:26.451534 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg"] Apr 24 21:38:26.453751 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:38:26.453718 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5cc5bdd_e4fa_448b_9e4f_66c01547116e.slice/crio-5fda63854fcae64b81d6ae4124b6db9e35be20eb68c6d2a57a9ed2a26ca81a5a WatchSource:0}: Error finding container 5fda63854fcae64b81d6ae4124b6db9e35be20eb68c6d2a57a9ed2a26ca81a5a: Status 404 returned error can't find the container with id 5fda63854fcae64b81d6ae4124b6db9e35be20eb68c6d2a57a9ed2a26ca81a5a Apr 24 21:38:27.069017 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:27.068965 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" event={"ID":"b5cc5bdd-e4fa-448b-9e4f-66c01547116e","Type":"ContainerStarted","Data":"79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c"} Apr 24 21:38:27.069210 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:27.069029 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" event={"ID":"b5cc5bdd-e4fa-448b-9e4f-66c01547116e","Type":"ContainerStarted","Data":"5fda63854fcae64b81d6ae4124b6db9e35be20eb68c6d2a57a9ed2a26ca81a5a"} Apr 24 21:38:30.077455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:30.077412 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:38:30.078122 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:30.077602 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:38:30.078122 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:30.077954 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.49:8082/healthz\": dial tcp 10.133.0.49:8082: connect: connection refused" Apr 24 21:38:40.079488 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:40.079453 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:38:40.080635 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:38:40.080614 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:40:11.501718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:11.501646 2569 generic.go:358] "Generic (PLEG): container finished" podID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" containerID="79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c" exitCode=0 Apr 24 21:40:11.502141 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:11.501725 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" event={"ID":"b5cc5bdd-e4fa-448b-9e4f-66c01547116e","Type":"ContainerDied","Data":"79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c"} Apr 24 21:40:13.511199 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:13.511162 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" 
event={"ID":"b5cc5bdd-e4fa-448b-9e4f-66c01547116e","Type":"ContainerStarted","Data":"ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e"} Apr 24 21:40:13.532575 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:13.532520 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" podStartSLOduration=107.486014548 podStartE2EDuration="1m48.532506258s" podCreationTimestamp="2026-04-24 21:38:25 +0000 UTC" firstStartedPulling="2026-04-24 21:40:11.502963797 +0000 UTC m=+822.743317036" lastFinishedPulling="2026-04-24 21:40:12.549455504 +0000 UTC m=+823.789808746" observedRunningTime="2026-04-24 21:40:13.529796155 +0000 UTC m=+824.770149416" watchObservedRunningTime="2026-04-24 21:40:13.532506258 +0000 UTC m=+824.772859519" Apr 24 21:40:16.302807 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:16.302767 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:40:16.302807 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:16.302817 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:40:16.315729 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:16.315695 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:40:16.534947 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:16.534907 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:40:17.202411 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:17.202375 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg"] Apr 24 21:40:17.624626 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:40:17.624593 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs: secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 24 21:40:17.625038 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:40:17.624693 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs podName:b5cc5bdd-e4fa-448b-9e4f-66c01547116e nodeName:}" failed. No retries permitted until 2026-04-24 21:40:18.124649667 +0000 UTC m=+829.365002910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs") pod "llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e") : secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 24 21:40:18.128526 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:40:18.128492 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs: secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 24 21:40:18.128746 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:40:18.128573 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs podName:b5cc5bdd-e4fa-448b-9e4f-66c01547116e nodeName:}" failed. No retries permitted until 2026-04-24 21:40:19.128557219 +0000 UTC m=+830.368910458 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs") pod "llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e") : secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 24 21:40:18.530238 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.530190 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" podUID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" containerName="main" containerID="cri-o://ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e" gracePeriod=30 Apr 24 21:40:18.787496 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.787430 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:40:18.935296 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935263 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs\") pod \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " Apr 24 21:40:18.935495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935315 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tmp-dir\") pod \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " Apr 24 21:40:18.935495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935345 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhq2t\" (UniqueName: 
\"kubernetes.io/projected/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kube-api-access-xhq2t\") pod \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " Apr 24 21:40:18.935495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935401 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kserve-provision-location\") pod \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " Apr 24 21:40:18.935495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935442 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-model-cache\") pod \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " Apr 24 21:40:18.935495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935472 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-dshm\") pod \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " Apr 24 21:40:18.935767 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935497 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-home\") pod \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\" (UID: \"b5cc5bdd-e4fa-448b-9e4f-66c01547116e\") " Apr 24 21:40:18.935767 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935652 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "b5cc5bdd-e4fa-448b-9e4f-66c01547116e" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e"). 
InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:18.935875 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935843 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-model-cache" (OuterVolumeSpecName: "model-cache") pod "b5cc5bdd-e4fa-448b-9e4f-66c01547116e" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:18.935928 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935868 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tmp-dir\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:40:18.935928 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.935910 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-home" (OuterVolumeSpecName: "home") pod "b5cc5bdd-e4fa-448b-9e4f-66c01547116e" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:18.937810 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.937785 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b5cc5bdd-e4fa-448b-9e4f-66c01547116e" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:40:18.937940 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.937819 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kube-api-access-xhq2t" (OuterVolumeSpecName: "kube-api-access-xhq2t") pod "b5cc5bdd-e4fa-448b-9e4f-66c01547116e" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e"). InnerVolumeSpecName "kube-api-access-xhq2t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:40:18.937940 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.937873 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-dshm" (OuterVolumeSpecName: "dshm") pod "b5cc5bdd-e4fa-448b-9e4f-66c01547116e" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:18.997651 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:18.997605 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5cc5bdd-e4fa-448b-9e4f-66c01547116e" (UID: "b5cc5bdd-e4fa-448b-9e4f-66c01547116e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:19.036359 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.036326 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:40:19.036359 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.036359 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xhq2t\" (UniqueName: \"kubernetes.io/projected/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kube-api-access-xhq2t\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:40:19.036563 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.036373 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:40:19.036563 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.036386 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-model-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:40:19.036563 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.036396 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-dshm\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:40:19.036563 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.036403 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5cc5bdd-e4fa-448b-9e4f-66c01547116e-home\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:40:19.535689 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.535633 2569 generic.go:358] 
"Generic (PLEG): container finished" podID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" containerID="ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e" exitCode=0 Apr 24 21:40:19.535887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.535721 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" event={"ID":"b5cc5bdd-e4fa-448b-9e4f-66c01547116e","Type":"ContainerDied","Data":"ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e"} Apr 24 21:40:19.535887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.535765 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" event={"ID":"b5cc5bdd-e4fa-448b-9e4f-66c01547116e","Type":"ContainerDied","Data":"5fda63854fcae64b81d6ae4124b6db9e35be20eb68c6d2a57a9ed2a26ca81a5a"} Apr 24 21:40:19.535887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.535769 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg" Apr 24 21:40:19.535887 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.535780 2569 scope.go:117] "RemoveContainer" containerID="ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e" Apr 24 21:40:19.545427 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.545404 2569 scope.go:117] "RemoveContainer" containerID="79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c" Apr 24 21:40:19.569689 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.569637 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg"] Apr 24 21:40:19.575108 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.575078 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-79d597857chcpkg"] Apr 24 21:40:19.610205 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.610184 2569 scope.go:117] "RemoveContainer" containerID="ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e" Apr 24 21:40:19.610563 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:40:19.610539 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e\": container with ID starting with ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e not found: ID does not exist" containerID="ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e" Apr 24 21:40:19.610640 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.610579 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e"} err="failed to get container status \"ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e\": rpc 
error: code = NotFound desc = could not find container \"ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e\": container with ID starting with ada14501805629157db18684b377e252d9e8230294bbc1eb0dd2bf3074e1238e not found: ID does not exist" Apr 24 21:40:19.610640 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.610608 2569 scope.go:117] "RemoveContainer" containerID="79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c" Apr 24 21:40:19.610960 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:40:19.610932 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c\": container with ID starting with 79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c not found: ID does not exist" containerID="79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c" Apr 24 21:40:19.611005 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:19.610968 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c"} err="failed to get container status \"79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c\": rpc error: code = NotFound desc = could not find container \"79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c\": container with ID starting with 79b8e8e7841e37fb8f7191f839326cfe0d79d2935877bfee3c366ad4f803e84c not found: ID does not exist" Apr 24 21:40:21.333046 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.333010 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" path="/var/lib/kubelet/pods/b5cc5bdd-e4fa-448b-9e4f-66c01547116e/volumes" Apr 24 21:40:21.446640 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.446601 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs"] Apr 24 21:40:21.447042 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.447017 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" containerName="storage-initializer" Apr 24 21:40:21.447144 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.447055 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" containerName="storage-initializer" Apr 24 21:40:21.447144 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.447078 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" containerName="main" Apr 24 21:40:21.447144 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.447085 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" containerName="main" Apr 24 21:40:21.447249 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.447155 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5cc5bdd-e4fa-448b-9e4f-66c01547116e" containerName="main" Apr 24 21:40:21.452546 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.452523 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.455321 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.455301 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-s2h6t\"" Apr 24 21:40:21.455460 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.455445 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 24 21:40:21.466702 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.466658 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs"] Apr 24 21:40:21.556526 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.556480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwr4k\" (UniqueName: \"kubernetes.io/projected/2b645dad-26c4-4289-95de-621be064af11-kube-api-access-wwr4k\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.556526 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.556529 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.556804 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.556619 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.556804 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.556653 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.556804 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.556724 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.556917 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.556804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b645dad-26c4-4289-95de-621be064af11-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.657665 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.657552 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b645dad-26c4-4289-95de-621be064af11-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.657665 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.657616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwr4k\" (UniqueName: \"kubernetes.io/projected/2b645dad-26c4-4289-95de-621be064af11-kube-api-access-wwr4k\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.657665 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.657639 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.657988 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.657703 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.657988 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.657721 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.657988 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.657764 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.658204 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.658177 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.658294 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.658208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.658294 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.658244 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.658398 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.658312 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.660536 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.660510 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b645dad-26c4-4289-95de-621be064af11-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.666241 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.666218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwr4k\" (UniqueName: \"kubernetes.io/projected/2b645dad-26c4-4289-95de-621be064af11-kube-api-access-wwr4k\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.762271 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.762232 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:21.903580 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:21.903550 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs"] Apr 24 21:40:21.905309 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:40:21.905281 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b645dad_26c4_4289_95de_621be064af11.slice/crio-9edfb3f916c616162ab585757e1a291519b47e568ccf109b21ebd3471badf47a WatchSource:0}: Error finding container 9edfb3f916c616162ab585757e1a291519b47e568ccf109b21ebd3471badf47a: Status 404 returned error can't find the container with id 9edfb3f916c616162ab585757e1a291519b47e568ccf109b21ebd3471badf47a Apr 24 21:40:22.550394 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:22.550355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" event={"ID":"2b645dad-26c4-4289-95de-621be064af11","Type":"ContainerStarted","Data":"96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969"} Apr 24 21:40:22.550830 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:22.550401 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" event={"ID":"2b645dad-26c4-4289-95de-621be064af11","Type":"ContainerStarted","Data":"9edfb3f916c616162ab585757e1a291519b47e568ccf109b21ebd3471badf47a"} Apr 24 21:40:23.556032 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:23.555991 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b645dad-26c4-4289-95de-621be064af11" containerID="96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969" exitCode=0 Apr 24 21:40:23.556528 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:40:23.556073 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" event={"ID":"2b645dad-26c4-4289-95de-621be064af11","Type":"ContainerDied","Data":"96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969"} Apr 24 21:40:24.562327 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:24.562274 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" event={"ID":"2b645dad-26c4-4289-95de-621be064af11","Type":"ContainerStarted","Data":"e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd"} Apr 24 21:40:24.562327 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:24.562335 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" event={"ID":"2b645dad-26c4-4289-95de-621be064af11","Type":"ContainerStarted","Data":"e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2"} Apr 24 21:40:24.562800 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:24.562386 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:31.762623 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:31.762576 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:31.762623 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:31.762624 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:31.765467 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:31.765441 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:31.802471 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:31.802418 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" podStartSLOduration=10.802402988 podStartE2EDuration="10.802402988s" podCreationTimestamp="2026-04-24 21:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:24.588210914 +0000 UTC m=+835.828564175" watchObservedRunningTime="2026-04-24 21:40:31.802402988 +0000 UTC m=+843.042756248" Apr 24 21:40:32.600739 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:32.600712 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:40:53.605435 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:40:53.605402 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:41:29.347413 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:41:29.347384 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:41:29.348963 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:41:29.348940 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:42:37.514547 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:37.514507 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs"] Apr 24 21:42:37.515108 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:42:37.514839 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="main" containerID="cri-o://e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2" gracePeriod=30 Apr 24 21:42:37.515108 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:37.514908 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="tokenizer" containerID="cri-o://e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd" gracePeriod=30 Apr 24 21:42:38.105099 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.105063 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b645dad-26c4-4289-95de-621be064af11" containerID="e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2" exitCode=0 Apr 24 21:42:38.105284 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.105140 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" event={"ID":"2b645dad-26c4-4289-95de-621be064af11","Type":"ContainerDied","Data":"e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2"} Apr 24 21:42:38.874269 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.874241 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" Apr 24 21:42:38.971398 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.971290 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-tmp\") pod \"2b645dad-26c4-4289-95de-621be064af11\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " Apr 24 21:42:38.971398 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.971340 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-cache\") pod \"2b645dad-26c4-4289-95de-621be064af11\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " Apr 24 21:42:38.971398 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.971380 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-uds\") pod \"2b645dad-26c4-4289-95de-621be064af11\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " Apr 24 21:42:38.971718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.971414 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwr4k\" (UniqueName: \"kubernetes.io/projected/2b645dad-26c4-4289-95de-621be064af11-kube-api-access-wwr4k\") pod \"2b645dad-26c4-4289-95de-621be064af11\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " Apr 24 21:42:38.971718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.971466 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-kserve-provision-location\") pod \"2b645dad-26c4-4289-95de-621be064af11\" (UID: 
\"2b645dad-26c4-4289-95de-621be064af11\") " Apr 24 21:42:38.971718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.971495 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b645dad-26c4-4289-95de-621be064af11-tls-certs\") pod \"2b645dad-26c4-4289-95de-621be064af11\" (UID: \"2b645dad-26c4-4289-95de-621be064af11\") " Apr 24 21:42:38.971856 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.971718 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2b645dad-26c4-4289-95de-621be064af11" (UID: "2b645dad-26c4-4289-95de-621be064af11"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:38.971856 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.971730 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2b645dad-26c4-4289-95de-621be064af11" (UID: "2b645dad-26c4-4289-95de-621be064af11"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:38.972070 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.972041 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2b645dad-26c4-4289-95de-621be064af11" (UID: "2b645dad-26c4-4289-95de-621be064af11"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:38.972361 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.972338 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2b645dad-26c4-4289-95de-621be064af11" (UID: "2b645dad-26c4-4289-95de-621be064af11"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:38.973798 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.973768 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b645dad-26c4-4289-95de-621be064af11-kube-api-access-wwr4k" (OuterVolumeSpecName: "kube-api-access-wwr4k") pod "2b645dad-26c4-4289-95de-621be064af11" (UID: "2b645dad-26c4-4289-95de-621be064af11"). InnerVolumeSpecName "kube-api-access-wwr4k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:42:38.973874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:38.973852 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b645dad-26c4-4289-95de-621be064af11-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2b645dad-26c4-4289-95de-621be064af11" (UID: "2b645dad-26c4-4289-95de-621be064af11"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:42:39.072260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.072218 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwr4k\" (UniqueName: \"kubernetes.io/projected/2b645dad-26c4-4289-95de-621be064af11-kube-api-access-wwr4k\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:42:39.072260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.072254 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:42:39.072260 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.072264 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b645dad-26c4-4289-95de-621be064af11-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:42:39.072605 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.072275 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:42:39.072605 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.072284 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:42:39.072605 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.072292 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2b645dad-26c4-4289-95de-621be064af11-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:42:39.110957 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.110921 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b645dad-26c4-4289-95de-621be064af11" containerID="e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd" exitCode=0
Apr 24 21:42:39.111147 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.111009 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs"
Apr 24 21:42:39.111147 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.111005 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" event={"ID":"2b645dad-26c4-4289-95de-621be064af11","Type":"ContainerDied","Data":"e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd"}
Apr 24 21:42:39.111147 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.111052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs" event={"ID":"2b645dad-26c4-4289-95de-621be064af11","Type":"ContainerDied","Data":"9edfb3f916c616162ab585757e1a291519b47e568ccf109b21ebd3471badf47a"}
Apr 24 21:42:39.111147 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.111073 2569 scope.go:117] "RemoveContainer" containerID="e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd"
Apr 24 21:42:39.121514 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.121495 2569 scope.go:117] "RemoveContainer" containerID="e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2"
Apr 24 21:42:39.130369 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.130344 2569 scope.go:117] "RemoveContainer" containerID="96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969"
Apr 24 21:42:39.134520 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.134494 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs"]
Apr 24 21:42:39.139083 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.139058 2569 scope.go:117] "RemoveContainer" containerID="e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd"
Apr 24 21:42:39.139401 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:42:39.139381 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd\": container with ID starting with e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd not found: ID does not exist" containerID="e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd"
Apr 24 21:42:39.139488 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.139411 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd"} err="failed to get container status \"e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd\": rpc error: code = NotFound desc = could not find container \"e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd\": container with ID starting with e5473b98ca47d3fb5be3956fd87e322ad8783067e17963276c4a64b65e0a51fd not found: ID does not exist"
Apr 24 21:42:39.139488 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.139433 2569 scope.go:117] "RemoveContainer" containerID="e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2"
Apr 24 21:42:39.139774 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:42:39.139737 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2\": container with ID starting with e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2 not found: ID does not exist" containerID="e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2"
Apr 24 21:42:39.139774 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.139768 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2"} err="failed to get container status \"e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2\": rpc error: code = NotFound desc = could not find container \"e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2\": container with ID starting with e0d118b87af8fc18448abb98a8c97a008552830381f2bc1b13b2f819d16ad1e2 not found: ID does not exist"
Apr 24 21:42:39.139940 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.139786 2569 scope.go:117] "RemoveContainer" containerID="96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969"
Apr 24 21:42:39.140086 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:42:39.140067 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969\": container with ID starting with 96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969 not found: ID does not exist" containerID="96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969"
Apr 24 21:42:39.140138 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.140094 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969"} err="failed to get container status \"96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969\": rpc error: code = NotFound desc = could not find container \"96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969\": container with ID starting with 96e73778590836a6ead053e5cdbeae77cfd2b6fa499d5e9fa09e5e6343f17969 not found: ID does not exist"
Apr 24 21:42:39.141203 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.141184 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegsjhs"]
Apr 24 21:42:39.333412 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:39.333383 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b645dad-26c4-4289-95de-621be064af11" path="/var/lib/kubelet/pods/2b645dad-26c4-4289-95de-621be064af11/volumes"
Apr 24 21:42:45.734436 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734404 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"]
Apr 24 21:42:45.734829 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734793 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="tokenizer"
Apr 24 21:42:45.734829 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734806 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="tokenizer"
Apr 24 21:42:45.734829 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734817 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="main"
Apr 24 21:42:45.734829 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734822 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="main"
Apr 24 21:42:45.734977 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734838 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="storage-initializer"
Apr 24 21:42:45.734977 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734845 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="storage-initializer"
Apr 24 21:42:45.734977 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734903 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="main"
Apr 24 21:42:45.734977 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.734912 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b645dad-26c4-4289-95de-621be064af11" containerName="tokenizer"
Apr 24 21:42:45.740231 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.740206 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.743073 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.743048 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-4tmmv\""
Apr 24 21:42:45.743220 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.743080 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 24 21:42:45.752482 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.752433 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"]
Apr 24 21:42:45.828988 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.828954 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.829165 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.829006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.829165 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.829036 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab494c3d-4306-432f-9265-1dd661661806-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.829165 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.829087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.829165 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.829117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.829165 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.829145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8qtb\" (UniqueName: \"kubernetes.io/projected/ab494c3d-4306-432f-9265-1dd661661806-kube-api-access-q8qtb\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930028 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.929990 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930218 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.930043 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930218 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.930083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab494c3d-4306-432f-9265-1dd661661806-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930218 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.930113 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930218 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.930137 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930218 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.930176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8qtb\" (UniqueName: \"kubernetes.io/projected/ab494c3d-4306-432f-9265-1dd661661806-kube-api-access-q8qtb\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930607 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.930575 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930745 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.930642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.930745 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.930696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.931121 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.931094 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.933072 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.933045 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab494c3d-4306-432f-9265-1dd661661806-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:45.940073 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:45.940035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8qtb\" (UniqueName: \"kubernetes.io/projected/ab494c3d-4306-432f-9265-1dd661661806-kube-api-access-q8qtb\") pod \"custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:46.050797 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:46.050751 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:46.186433 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:46.186397 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"]
Apr 24 21:42:46.187707 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:42:46.187658 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab494c3d_4306_432f_9265_1dd661661806.slice/crio-fb21de13cfb788d20e475de11698ebef317545227d22db3afa035077ed7732cb WatchSource:0}: Error finding container fb21de13cfb788d20e475de11698ebef317545227d22db3afa035077ed7732cb: Status 404 returned error can't find the container with id fb21de13cfb788d20e475de11698ebef317545227d22db3afa035077ed7732cb
Apr 24 21:42:46.189597 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:46.189579 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:42:47.149521 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:47.149488 2569 generic.go:358] "Generic (PLEG): container finished" podID="ab494c3d-4306-432f-9265-1dd661661806" containerID="fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a" exitCode=0
Apr 24 21:42:47.150075 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:47.149534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" event={"ID":"ab494c3d-4306-432f-9265-1dd661661806","Type":"ContainerDied","Data":"fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a"}
Apr 24 21:42:47.150075 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:47.149556 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" event={"ID":"ab494c3d-4306-432f-9265-1dd661661806","Type":"ContainerStarted","Data":"fb21de13cfb788d20e475de11698ebef317545227d22db3afa035077ed7732cb"}
Apr 24 21:42:48.156343 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:48.156259 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" event={"ID":"ab494c3d-4306-432f-9265-1dd661661806","Type":"ContainerStarted","Data":"6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688"}
Apr 24 21:42:48.156343 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:48.156300 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" event={"ID":"ab494c3d-4306-432f-9265-1dd661661806","Type":"ContainerStarted","Data":"835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe"}
Apr 24 21:42:48.156781 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:48.156390 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:48.182918 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:48.182866 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" podStartSLOduration=3.182850857 podStartE2EDuration="3.182850857s" podCreationTimestamp="2026-04-24 21:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:42:48.181495531 +0000 UTC m=+979.421848793" watchObservedRunningTime="2026-04-24 21:42:48.182850857 +0000 UTC m=+979.423204117"
Apr 24 21:42:56.051826 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:56.051787 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:56.052244 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:56.051841 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:56.054836 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:56.054809 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:42:56.190731 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:42:56.190699 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:43:17.195204 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:43:17.195172 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:44:42.049328 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:42.049293 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"]
Apr 24 21:44:42.049957 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:42.049653 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="main" containerID="cri-o://835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe" gracePeriod=30
Apr 24 21:44:42.049957 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:42.049743 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="tokenizer" containerID="cri-o://6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688" gracePeriod=30
Apr 24 21:44:42.613898 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:42.613864 2569 generic.go:358] "Generic (PLEG): container finished" podID="ab494c3d-4306-432f-9265-1dd661661806" containerID="835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe" exitCode=0
Apr 24 21:44:42.614073 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:42.613941 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" event={"ID":"ab494c3d-4306-432f-9265-1dd661661806","Type":"ContainerDied","Data":"835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe"}
Apr 24 21:44:43.401373 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.401343 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:44:43.543095 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543063 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-uds\") pod \"ab494c3d-4306-432f-9265-1dd661661806\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") "
Apr 24 21:44:43.543095 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543101 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-cache\") pod \"ab494c3d-4306-432f-9265-1dd661661806\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") "
Apr 24 21:44:43.543318 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543156 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8qtb\" (UniqueName: \"kubernetes.io/projected/ab494c3d-4306-432f-9265-1dd661661806-kube-api-access-q8qtb\") pod \"ab494c3d-4306-432f-9265-1dd661661806\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") "
Apr 24 21:44:43.543318 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543201 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab494c3d-4306-432f-9265-1dd661661806-tls-certs\") pod \"ab494c3d-4306-432f-9265-1dd661661806\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") "
Apr 24 21:44:43.543318 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543223 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-tmp\") pod \"ab494c3d-4306-432f-9265-1dd661661806\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") "
Apr 24 21:44:43.543318 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543252 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-kserve-provision-location\") pod \"ab494c3d-4306-432f-9265-1dd661661806\" (UID: \"ab494c3d-4306-432f-9265-1dd661661806\") "
Apr 24 21:44:43.543524 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543383 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ab494c3d-4306-432f-9265-1dd661661806" (UID: "ab494c3d-4306-432f-9265-1dd661661806"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:43.543524 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543457 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ab494c3d-4306-432f-9265-1dd661661806" (UID: "ab494c3d-4306-432f-9265-1dd661661806"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:43.543638 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543571 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ab494c3d-4306-432f-9265-1dd661661806" (UID: "ab494c3d-4306-432f-9265-1dd661661806"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:43.543638 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543592 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:44:43.543638 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.543608 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:44:43.544077 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.544048 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ab494c3d-4306-432f-9265-1dd661661806" (UID: "ab494c3d-4306-432f-9265-1dd661661806"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:44:43.545430 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.545410 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab494c3d-4306-432f-9265-1dd661661806-kube-api-access-q8qtb" (OuterVolumeSpecName: "kube-api-access-q8qtb") pod "ab494c3d-4306-432f-9265-1dd661661806" (UID: "ab494c3d-4306-432f-9265-1dd661661806"). InnerVolumeSpecName "kube-api-access-q8qtb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:44:43.545509 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.545490 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab494c3d-4306-432f-9265-1dd661661806-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ab494c3d-4306-432f-9265-1dd661661806" (UID: "ab494c3d-4306-432f-9265-1dd661661806"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:44:43.619744 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.619700 2569 generic.go:358] "Generic (PLEG): container finished" podID="ab494c3d-4306-432f-9265-1dd661661806" containerID="6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688" exitCode=0
Apr 24 21:44:43.619932 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.619757 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" event={"ID":"ab494c3d-4306-432f-9265-1dd661661806","Type":"ContainerDied","Data":"6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688"}
Apr 24 21:44:43.619932 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.619787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx" event={"ID":"ab494c3d-4306-432f-9265-1dd661661806","Type":"ContainerDied","Data":"fb21de13cfb788d20e475de11698ebef317545227d22db3afa035077ed7732cb"}
Apr 24 21:44:43.619932 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.619788 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"
Apr 24 21:44:43.619932 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.619803 2569 scope.go:117] "RemoveContainer" containerID="6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688"
Apr 24 21:44:43.629992 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.629943 2569 scope.go:117] "RemoveContainer" containerID="835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe"
Apr 24 21:44:43.639293 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.639264 2569 scope.go:117] "RemoveContainer" containerID="fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a"
Apr 24 21:44:43.644982 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.644953 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8qtb\" (UniqueName: \"kubernetes.io/projected/ab494c3d-4306-432f-9265-1dd661661806-kube-api-access-q8qtb\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:44:43.644982 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.644985 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab494c3d-4306-432f-9265-1dd661661806-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:44:43.644982 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.644996 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:44:43.645261 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.645004 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab494c3d-4306-432f-9265-1dd661661806-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:44:43.645552 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.645529 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"]
Apr 24 21:44:43.648685 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.648647 2569 scope.go:117] "RemoveContainer" containerID="6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688"
Apr 24 21:44:43.648992 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:44:43.648968 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688\": container with ID starting with 6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688 not found: ID does not exist" containerID="6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688"
Apr 24 21:44:43.649103 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.649001 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688"} err="failed to get container status \"6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688\": rpc error: code = NotFound desc = could not find container \"6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688\": container with ID starting with 6ef1c3e5ee3e1330b6eb81b508c6b567b4ae0830ea3a28647342e52a62132688 not found: ID does not exist"
Apr 24 21:44:43.649103 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.649020 2569 scope.go:117] "RemoveContainer" containerID="835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe"
Apr 24 21:44:43.649331 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:44:43.649310 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe\": container with ID starting with
835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe not found: ID does not exist" containerID="835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe" Apr 24 21:44:43.649381 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.649340 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe"} err="failed to get container status \"835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe\": rpc error: code = NotFound desc = could not find container \"835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe\": container with ID starting with 835d9a4b0b047f62dced1550395889db2630a14d0d93c3f986bbf58b0e34cabe not found: ID does not exist" Apr 24 21:44:43.649381 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.649365 2569 scope.go:117] "RemoveContainer" containerID="fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a" Apr 24 21:44:43.649654 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:44:43.649635 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a\": container with ID starting with fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a not found: ID does not exist" containerID="fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a" Apr 24 21:44:43.649734 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.649663 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a"} err="failed to get container status \"fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a\": rpc error: code = NotFound desc = could not find container \"fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a\": container with ID starting with 
fe43274db9605b7814b65acb0440f4bf9c95cc113e9d20136c9068660450202a not found: ID does not exist" Apr 24 21:44:43.654149 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:43.654127 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-679f765b4t4fx"] Apr 24 21:44:45.335759 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:45.335713 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab494c3d-4306-432f-9265-1dd661661806" path="/var/lib/kubelet/pods/ab494c3d-4306-432f-9265-1dd661661806/volumes" Apr 24 21:44:51.162811 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.162769 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg"] Apr 24 21:44:51.163274 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.163140 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="storage-initializer" Apr 24 21:44:51.163274 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.163150 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="storage-initializer" Apr 24 21:44:51.163274 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.163161 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="main" Apr 24 21:44:51.163274 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.163166 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="main" Apr 24 21:44:51.163274 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.163176 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="tokenizer" Apr 24 21:44:51.163274 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:44:51.163181 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="tokenizer" Apr 24 21:44:51.163274 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.163233 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="main" Apr 24 21:44:51.163274 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.163241 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab494c3d-4306-432f-9265-1dd661661806" containerName="tokenizer" Apr 24 21:44:51.168319 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.168299 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.171225 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.171201 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 24 21:44:51.171368 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.171203 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-c7mhf\"" Apr 24 21:44:51.177619 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.177597 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg"] Apr 24 21:44:51.197838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.197793 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" 
Apr 24 21:44:51.198014 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.197865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.198014 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.197886 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.198014 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.197915 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.198014 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.197985 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.198193 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.198051 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbtdl\" (UniqueName: \"kubernetes.io/projected/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kube-api-access-lbtdl\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299187 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299383 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299217 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299383 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299383 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299383 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299590 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299426 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbtdl\" (UniqueName: \"kubernetes.io/projected/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kube-api-access-lbtdl\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299590 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299716 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299639 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299771 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299714 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.299825 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.299780 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.301988 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.301967 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 
21:44:51.309526 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.309502 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbtdl\" (UniqueName: \"kubernetes.io/projected/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kube-api-access-lbtdl\") pod \"router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.479007 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.478922 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:51.618045 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.618016 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg"] Apr 24 21:44:51.619963 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:44:51.619933 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7bf2ce_9814_4fc2_b9e5_6376d8314444.slice/crio-b410002c0a12c41add4449023ddbc8f848c922941e11c2ba88d34546f6f5cac2 WatchSource:0}: Error finding container b410002c0a12c41add4449023ddbc8f848c922941e11c2ba88d34546f6f5cac2: Status 404 returned error can't find the container with id b410002c0a12c41add4449023ddbc8f848c922941e11c2ba88d34546f6f5cac2 Apr 24 21:44:51.656091 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:51.656060 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" event={"ID":"2a7bf2ce-9814-4fc2-b9e5-6376d8314444","Type":"ContainerStarted","Data":"b410002c0a12c41add4449023ddbc8f848c922941e11c2ba88d34546f6f5cac2"} Apr 24 21:44:52.661620 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:52.661520 2569 generic.go:358] "Generic (PLEG): 
container finished" podID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerID="d54b5c124e4932af608bc587bb92019d6cb48f0919543ad9c6bfc87d573427b8" exitCode=0 Apr 24 21:44:52.661620 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:52.661560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" event={"ID":"2a7bf2ce-9814-4fc2-b9e5-6376d8314444","Type":"ContainerDied","Data":"d54b5c124e4932af608bc587bb92019d6cb48f0919543ad9c6bfc87d573427b8"} Apr 24 21:44:53.667423 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:53.667383 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" event={"ID":"2a7bf2ce-9814-4fc2-b9e5-6376d8314444","Type":"ContainerStarted","Data":"90bb30c102691434b6f99a2c6f3d858ecb314834270b526750e6328966200a40"} Apr 24 21:44:53.667423 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:53.667430 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" event={"ID":"2a7bf2ce-9814-4fc2-b9e5-6376d8314444","Type":"ContainerStarted","Data":"7730f27a5d7d2da8225efb4dee1e0829f134515ebeb4fd16bd5761ffab2cc955"} Apr 24 21:44:53.667868 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:53.667489 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:44:53.690296 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:44:53.690243 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" podStartSLOduration=2.690223293 podStartE2EDuration="2.690223293s" podCreationTimestamp="2026-04-24 21:44:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-24 21:44:53.68858934 +0000 UTC m=+1104.928942603" watchObservedRunningTime="2026-04-24 21:44:53.690223293 +0000 UTC m=+1104.930576554" Apr 24 21:45:01.479785 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:45:01.479747 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:45:01.479785 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:45:01.479787 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:45:01.482509 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:45:01.482481 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:45:01.699812 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:45:01.699782 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:45:22.703798 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:45:22.703772 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:46:29.377762 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:29.377730 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:46:29.380267 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:29.380243 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:46:46.841455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:46.841410 2569 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg"] Apr 24 21:46:46.842094 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:46.841754 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="main" containerID="cri-o://7730f27a5d7d2da8225efb4dee1e0829f134515ebeb4fd16bd5761ffab2cc955" gracePeriod=30 Apr 24 21:46:46.842094 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:46.841817 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="tokenizer" containerID="cri-o://90bb30c102691434b6f99a2c6f3d858ecb314834270b526750e6328966200a40" gracePeriod=30 Apr 24 21:46:47.120152 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:47.120065 2569 generic.go:358] "Generic (PLEG): container finished" podID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerID="7730f27a5d7d2da8225efb4dee1e0829f134515ebeb4fd16bd5761ffab2cc955" exitCode=0 Apr 24 21:46:47.120152 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:47.120137 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" event={"ID":"2a7bf2ce-9814-4fc2-b9e5-6376d8314444","Type":"ContainerDied","Data":"7730f27a5d7d2da8225efb4dee1e0829f134515ebeb4fd16bd5761ffab2cc955"} Apr 24 21:46:48.128610 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.128567 2569 generic.go:358] "Generic (PLEG): container finished" podID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerID="90bb30c102691434b6f99a2c6f3d858ecb314834270b526750e6328966200a40" exitCode=0 Apr 24 21:46:48.128976 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.128634 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" event={"ID":"2a7bf2ce-9814-4fc2-b9e5-6376d8314444","Type":"ContainerDied","Data":"90bb30c102691434b6f99a2c6f3d858ecb314834270b526750e6328966200a40"} Apr 24 21:46:48.192919 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.192891 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.217122 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kserve-provision-location\") pod \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.217213 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-cache\") pod \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.217269 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbtdl\" (UniqueName: \"kubernetes.io/projected/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kube-api-access-lbtdl\") pod \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.217298 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-uds\") pod \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\" (UID: 
\"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.217366 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tls-certs\") pod \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.217401 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-tmp\") pod \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\" (UID: \"2a7bf2ce-9814-4fc2-b9e5-6376d8314444\") " Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.218054 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2a7bf2ce-9814-4fc2-b9e5-6376d8314444" (UID: "2a7bf2ce-9814-4fc2-b9e5-6376d8314444"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.218289 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2a7bf2ce-9814-4fc2-b9e5-6376d8314444" (UID: "2a7bf2ce-9814-4fc2-b9e5-6376d8314444"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.218306 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2a7bf2ce-9814-4fc2-b9e5-6376d8314444" (UID: "2a7bf2ce-9814-4fc2-b9e5-6376d8314444"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:48.219184 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.218882 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2a7bf2ce-9814-4fc2-b9e5-6376d8314444" (UID: "2a7bf2ce-9814-4fc2-b9e5-6376d8314444"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:46:48.221200 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.221167 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2a7bf2ce-9814-4fc2-b9e5-6376d8314444" (UID: "2a7bf2ce-9814-4fc2-b9e5-6376d8314444"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:48.221496 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.221471 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kube-api-access-lbtdl" (OuterVolumeSpecName: "kube-api-access-lbtdl") pod "2a7bf2ce-9814-4fc2-b9e5-6376d8314444" (UID: "2a7bf2ce-9814-4fc2-b9e5-6376d8314444"). InnerVolumeSpecName "kube-api-access-lbtdl". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:46:48.319001 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.318962 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:46:48.319001 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.319001 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:46:48.319219 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.319016 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:46:48.319219 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.319029 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:46:48.319219 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.319040 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbtdl\" (UniqueName: \"kubernetes.io/projected/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-kube-api-access-lbtdl\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:46:48.319219 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:48.319053 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a7bf2ce-9814-4fc2-b9e5-6376d8314444-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:46:49.134746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:49.134712 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg" event={"ID":"2a7bf2ce-9814-4fc2-b9e5-6376d8314444","Type":"ContainerDied","Data":"b410002c0a12c41add4449023ddbc8f848c922941e11c2ba88d34546f6f5cac2"}
Apr 24 21:46:49.135182 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:49.134751 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg"
Apr 24 21:46:49.135182 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:49.134758 2569 scope.go:117] "RemoveContainer" containerID="90bb30c102691434b6f99a2c6f3d858ecb314834270b526750e6328966200a40"
Apr 24 21:46:49.147082 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:49.147065 2569 scope.go:117] "RemoveContainer" containerID="7730f27a5d7d2da8225efb4dee1e0829f134515ebeb4fd16bd5761ffab2cc955"
Apr 24 21:46:49.155530 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:49.155511 2569 scope.go:117] "RemoveContainer" containerID="d54b5c124e4932af608bc587bb92019d6cb48f0919543ad9c6bfc87d573427b8"
Apr 24 21:46:49.158684 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:49.158635 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg"]
Apr 24 21:46:49.161269 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:49.161249 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-7449dff6d-kmjcg"]
Apr 24 21:46:49.338422 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:46:49.338390 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" path="/var/lib/kubelet/pods/2a7bf2ce-9814-4fc2-b9e5-6376d8314444/volumes"
Apr 24 21:47:04.209833 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.209795 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"]
Apr 24 21:47:04.210275 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.210259 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="storage-initializer"
Apr 24 21:47:04.210321 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.210277 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="storage-initializer"
Apr 24 21:47:04.210321 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.210286 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="main"
Apr 24 21:47:04.210321 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.210297 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="main"
Apr 24 21:47:04.210321 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.210315 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="tokenizer"
Apr 24 21:47:04.210321 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.210322 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="tokenizer"
Apr 24 21:47:04.210474 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.210389 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="main"
Apr 24 21:47:04.210474 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.210400 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a7bf2ce-9814-4fc2-b9e5-6376d8314444" containerName="tokenizer"
Apr 24 21:47:04.213624 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.213607 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.218392 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.218368 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-8zzfg\""
Apr 24 21:47:04.218574 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.218552 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 24 21:47:04.230183 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.230155 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"]
Apr 24 21:47:04.360287 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.360247 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.360467 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.360300 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.360467 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.360366 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.360467 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.360447 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wkkx\" (UniqueName: \"kubernetes.io/projected/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kube-api-access-9wkkx\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.360586 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.360494 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.360586 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.360561 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461354 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461252 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461354 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461307 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461354 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461351 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461628 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461628 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wkkx\" (UniqueName: \"kubernetes.io/projected/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kube-api-access-9wkkx\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461769 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461769 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461689 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461848 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461814 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461903 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461882 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.461960 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.461944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.464291 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.464272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.469902 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.469877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wkkx\" (UniqueName: \"kubernetes.io/projected/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kube-api-access-9wkkx\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.523617 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.523575 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:04.662062 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:04.662034 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"]
Apr 24 21:47:04.663821 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:47:04.663789 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7de7be_a0a3_4559_91d7_a5440e9f13fa.slice/crio-8d7c0dffd4b8128a32383ac6602f789e55724cabaffe74c07273f28e7fe99816 WatchSource:0}: Error finding container 8d7c0dffd4b8128a32383ac6602f789e55724cabaffe74c07273f28e7fe99816: Status 404 returned error can't find the container with id 8d7c0dffd4b8128a32383ac6602f789e55724cabaffe74c07273f28e7fe99816
Apr 24 21:47:05.199975 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:05.199852 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" event={"ID":"2f7de7be-a0a3-4559-91d7-a5440e9f13fa","Type":"ContainerStarted","Data":"bcb73ab31c025350de6c53b653389918bd92931e9654db258a7bc65fa815d0fa"}
Apr 24 21:47:05.199975 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:05.199896 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" event={"ID":"2f7de7be-a0a3-4559-91d7-a5440e9f13fa","Type":"ContainerStarted","Data":"8d7c0dffd4b8128a32383ac6602f789e55724cabaffe74c07273f28e7fe99816"}
Apr 24 21:47:06.205833 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:06.205796 2569 generic.go:358] "Generic (PLEG): container finished" podID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerID="bcb73ab31c025350de6c53b653389918bd92931e9654db258a7bc65fa815d0fa" exitCode=0
Apr 24 21:47:06.206013 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:06.205872 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" event={"ID":"2f7de7be-a0a3-4559-91d7-a5440e9f13fa","Type":"ContainerDied","Data":"bcb73ab31c025350de6c53b653389918bd92931e9654db258a7bc65fa815d0fa"}
Apr 24 21:47:07.213754 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:07.213715 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" event={"ID":"2f7de7be-a0a3-4559-91d7-a5440e9f13fa","Type":"ContainerStarted","Data":"bbb653a0752e08cfc07c53a363e7b99e2f873d790c93b872b1e90413c71a6794"}
Apr 24 21:47:07.214215 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:07.213986 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" event={"ID":"2f7de7be-a0a3-4559-91d7-a5440e9f13fa","Type":"ContainerStarted","Data":"981f0b72403337f428ec9449fe40119b4f547320439c35e72e50baecc5416846"}
Apr 24 21:47:07.214215 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:07.214121 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:07.239404 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:07.239331 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" podStartSLOduration=3.239309696 podStartE2EDuration="3.239309696s" podCreationTimestamp="2026-04-24 21:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:07.234188381 +0000 UTC m=+1238.474541655" watchObservedRunningTime="2026-04-24 21:47:07.239309696 +0000 UTC m=+1238.479662961"
Apr 24 21:47:14.523910 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:14.523873 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:14.524387 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:14.523925 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:14.525377 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:14.525335 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.54:8082/healthz\": dial tcp 10.133.0.54:8082: connect: connection refused"
Apr 24 21:47:24.525344 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:24.525259 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:24.526568 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:24.526545 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:47:45.297502 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:47:45.297469 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:49:42.649338 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:42.649306 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"]
Apr 24 21:49:42.649908 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:42.649603 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="main" containerID="cri-o://981f0b72403337f428ec9449fe40119b4f547320439c35e72e50baecc5416846" gracePeriod=30
Apr 24 21:49:42.649908 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:42.649694 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="tokenizer" containerID="cri-o://bbb653a0752e08cfc07c53a363e7b99e2f873d790c93b872b1e90413c71a6794" gracePeriod=30
Apr 24 21:49:42.832108 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:42.832075 2569 generic.go:358] "Generic (PLEG): container finished" podID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerID="981f0b72403337f428ec9449fe40119b4f547320439c35e72e50baecc5416846" exitCode=0
Apr 24 21:49:42.832285 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:42.832157 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" event={"ID":"2f7de7be-a0a3-4559-91d7-a5440e9f13fa","Type":"ContainerDied","Data":"981f0b72403337f428ec9449fe40119b4f547320439c35e72e50baecc5416846"}
Apr 24 21:49:43.839553 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.839520 2569 generic.go:358] "Generic (PLEG): container finished" podID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerID="bbb653a0752e08cfc07c53a363e7b99e2f873d790c93b872b1e90413c71a6794" exitCode=0
Apr 24 21:49:43.839928 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.839600 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" event={"ID":"2f7de7be-a0a3-4559-91d7-a5440e9f13fa","Type":"ContainerDied","Data":"bbb653a0752e08cfc07c53a363e7b99e2f873d790c93b872b1e90413c71a6794"}
Apr 24 21:49:43.896934 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.896911 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:49:43.942707 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.942661 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-cache\") pod \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") "
Apr 24 21:49:43.942886 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.942718 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kserve-provision-location\") pod \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") "
Apr 24 21:49:43.942886 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.942753 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-tmp\") pod \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") "
Apr 24 21:49:43.942886 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.942773 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-uds\") pod \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") "
Apr 24 21:49:43.942886 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.942820 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wkkx\" (UniqueName: \"kubernetes.io/projected/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kube-api-access-9wkkx\") pod \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") "
Apr 24 21:49:43.942886 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.942847 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tls-certs\") pod \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\" (UID: \"2f7de7be-a0a3-4559-91d7-a5440e9f13fa\") "
Apr 24 21:49:43.943140 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.942978 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2f7de7be-a0a3-4559-91d7-a5440e9f13fa" (UID: "2f7de7be-a0a3-4559-91d7-a5440e9f13fa"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:49:43.943140 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.943001 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2f7de7be-a0a3-4559-91d7-a5440e9f13fa" (UID: "2f7de7be-a0a3-4559-91d7-a5440e9f13fa"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:49:43.943140 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.943091 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2f7de7be-a0a3-4559-91d7-a5440e9f13fa" (UID: "2f7de7be-a0a3-4559-91d7-a5440e9f13fa"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:49:43.943270 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.943164 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:49:43.943270 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.943187 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:49:43.943784 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.943756 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2f7de7be-a0a3-4559-91d7-a5440e9f13fa" (UID: "2f7de7be-a0a3-4559-91d7-a5440e9f13fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:49:43.945178 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.945151 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kube-api-access-9wkkx" (OuterVolumeSpecName: "kube-api-access-9wkkx") pod "2f7de7be-a0a3-4559-91d7-a5440e9f13fa" (UID: "2f7de7be-a0a3-4559-91d7-a5440e9f13fa"). InnerVolumeSpecName "kube-api-access-9wkkx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:49:43.945273 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:43.945246 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2f7de7be-a0a3-4559-91d7-a5440e9f13fa" (UID: "2f7de7be-a0a3-4559-91d7-a5440e9f13fa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:49:44.043594 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.043563 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9wkkx\" (UniqueName: \"kubernetes.io/projected/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kube-api-access-9wkkx\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:49:44.043594 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.043594 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:49:44.043827 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.043611 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:49:44.043827 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.043622 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f7de7be-a0a3-4559-91d7-a5440e9f13fa-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 21:49:44.845190 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.845152 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg" event={"ID":"2f7de7be-a0a3-4559-91d7-a5440e9f13fa","Type":"ContainerDied","Data":"8d7c0dffd4b8128a32383ac6602f789e55724cabaffe74c07273f28e7fe99816"}
Apr 24 21:49:44.845190 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.845199 2569 scope.go:117] "RemoveContainer" containerID="bbb653a0752e08cfc07c53a363e7b99e2f873d790c93b872b1e90413c71a6794"
Apr 24 21:49:44.845743 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.845245 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"
Apr 24 21:49:44.855270 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.855252 2569 scope.go:117] "RemoveContainer" containerID="981f0b72403337f428ec9449fe40119b4f547320439c35e72e50baecc5416846"
Apr 24 21:49:44.864299 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.864272 2569 scope.go:117] "RemoveContainer" containerID="bcb73ab31c025350de6c53b653389918bd92931e9654db258a7bc65fa815d0fa"
Apr 24 21:49:44.884980 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.884949 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"]
Apr 24 21:49:44.888027 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:44.888001 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehzhfg"]
Apr 24 21:49:45.332416 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:45.332375 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" path="/var/lib/kubelet/pods/2f7de7be-a0a3-4559-91d7-a5440e9f13fa/volumes"
Apr 24 21:49:50.562963 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.562927 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"]
Apr 24 21:49:50.563414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.563324 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="storage-initializer"
Apr 24 21:49:50.563414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.563336 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="storage-initializer"
Apr 24 21:49:50.563414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.563346 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="main"
Apr 24 21:49:50.563414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.563353 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="main"
Apr 24 21:49:50.563414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.563365 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="tokenizer"
Apr 24 21:49:50.563414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.563371 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="tokenizer"
Apr 24 21:49:50.563605 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.563423 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="main"
Apr 24 21:49:50.563605 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.563433 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f7de7be-a0a3-4559-91d7-a5440e9f13fa" containerName="tokenizer"
Apr 24 21:49:50.568734 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.568709 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.572555 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.572526 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-xn4pt\""
Apr 24 21:49:50.572832 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.572813 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 24 21:49:50.579955 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.579926 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"]
Apr 24 21:49:50.703458 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.703420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.703458 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.703465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.703737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.703525 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.703737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.703599 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.703737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.703626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.703737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.703665 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqprn\" (UniqueName: \"kubernetes.io/projected/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kube-api-access-hqprn\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.804336 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.804336 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.804583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804372 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqprn\" (UniqueName: \"kubernetes.io/projected/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kube-api-access-hqprn\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.804583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 21:49:50.804583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804430 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:50.804583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:50.804838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:50.804893 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:50.804893 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804838 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:50.804987 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.804946 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:50.807032 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.807012 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:50.814095 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.814033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqprn\" (UniqueName: \"kubernetes.io/projected/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kube-api-access-hqprn\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:50.881857 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:50.881823 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:51.018241 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:51.018211 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"] Apr 24 21:49:51.021286 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:49:51.021244 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c2f79c5_ca8e_4133_bcd3_44e48b3d98f9.slice/crio-6f393a93423d7d08107775f6163cf9bac16418516a2f660c5121d47971bdd418 WatchSource:0}: Error finding container 6f393a93423d7d08107775f6163cf9bac16418516a2f660c5121d47971bdd418: Status 404 returned error can't find the container with id 6f393a93423d7d08107775f6163cf9bac16418516a2f660c5121d47971bdd418 Apr 24 21:49:51.023538 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:51.023522 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:49:51.873849 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:51.873816 2569 generic.go:358] "Generic (PLEG): container finished" podID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerID="b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326" exitCode=0 Apr 24 21:49:51.874230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:51.873907 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" event={"ID":"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9","Type":"ContainerDied","Data":"b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326"} Apr 24 21:49:51.874230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:51.873944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" 
event={"ID":"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9","Type":"ContainerStarted","Data":"6f393a93423d7d08107775f6163cf9bac16418516a2f660c5121d47971bdd418"} Apr 24 21:49:52.880561 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:52.880522 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" event={"ID":"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9","Type":"ContainerStarted","Data":"ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b"} Apr 24 21:49:52.880561 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:52.880563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" event={"ID":"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9","Type":"ContainerStarted","Data":"150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a"} Apr 24 21:49:52.881007 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:52.880693 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:49:52.908992 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:49:52.908942 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" podStartSLOduration=2.908925736 podStartE2EDuration="2.908925736s" podCreationTimestamp="2026-04-24 21:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:52.906745695 +0000 UTC m=+1404.147098956" watchObservedRunningTime="2026-04-24 21:49:52.908925736 +0000 UTC m=+1404.149278997" Apr 24 21:50:00.881958 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:50:00.881921 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:50:00.882551 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:50:00.882032 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:50:00.884781 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:50:00.884756 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:50:00.914864 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:50:00.914837 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:50:22.925005 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:50:22.924926 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" Apr 24 21:51:29.408407 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:51:29.408377 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:51:29.412506 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:51:29.412482 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:52:39.518248 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:39.517965 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297"] Apr 24 21:52:39.521078 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:39.518646 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="main" containerID="cri-o://6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667" gracePeriod=30 Apr 24 21:52:39.521078 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:39.518769 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="tokenizer" containerID="cri-o://db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2" gracePeriod=30 Apr 24 21:52:40.080796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.080750 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.49:8082/healthz\": dial tcp 10.133.0.49:8082: connect: connection refused" Apr 24 21:52:40.545324 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.545285 2569 generic.go:358] "Generic (PLEG): container finished" podID="e76fa608-77f5-48a2-8029-94db346c054a" containerID="6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667" exitCode=0 Apr 24 21:52:40.545748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.545364 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" event={"ID":"e76fa608-77f5-48a2-8029-94db346c054a","Type":"ContainerDied","Data":"6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667"} Apr 24 21:52:40.878712 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.878661 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:52:40.969849 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.969812 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rw9w\" (UniqueName: \"kubernetes.io/projected/e76fa608-77f5-48a2-8029-94db346c054a-kube-api-access-8rw9w\") pod \"e76fa608-77f5-48a2-8029-94db346c054a\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " Apr 24 21:52:40.969849 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.969856 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e76fa608-77f5-48a2-8029-94db346c054a-tls-certs\") pod \"e76fa608-77f5-48a2-8029-94db346c054a\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " Apr 24 21:52:40.970061 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.969909 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-tmp\") pod \"e76fa608-77f5-48a2-8029-94db346c054a\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " Apr 24 21:52:40.970099 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.970063 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-kserve-provision-location\") pod \"e76fa608-77f5-48a2-8029-94db346c054a\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " Apr 24 21:52:40.970163 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.970145 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-cache\") pod \"e76fa608-77f5-48a2-8029-94db346c054a\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " 
Apr 24 21:52:40.970222 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.970208 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-uds\") pod \"e76fa608-77f5-48a2-8029-94db346c054a\" (UID: \"e76fa608-77f5-48a2-8029-94db346c054a\") " Apr 24 21:52:40.970283 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.970229 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e76fa608-77f5-48a2-8029-94db346c054a" (UID: "e76fa608-77f5-48a2-8029-94db346c054a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:40.970495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.970435 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e76fa608-77f5-48a2-8029-94db346c054a" (UID: "e76fa608-77f5-48a2-8029-94db346c054a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:40.970495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.970467 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:52:40.970644 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.970538 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e76fa608-77f5-48a2-8029-94db346c054a" (UID: "e76fa608-77f5-48a2-8029-94db346c054a"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:40.970845 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.970824 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e76fa608-77f5-48a2-8029-94db346c054a" (UID: "e76fa608-77f5-48a2-8029-94db346c054a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:40.972601 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.972580 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e76fa608-77f5-48a2-8029-94db346c054a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e76fa608-77f5-48a2-8029-94db346c054a" (UID: "e76fa608-77f5-48a2-8029-94db346c054a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:52:40.972718 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:40.972608 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e76fa608-77f5-48a2-8029-94db346c054a-kube-api-access-8rw9w" (OuterVolumeSpecName: "kube-api-access-8rw9w") pod "e76fa608-77f5-48a2-8029-94db346c054a" (UID: "e76fa608-77f5-48a2-8029-94db346c054a"). InnerVolumeSpecName "kube-api-access-8rw9w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:52:41.071119 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.071080 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:52:41.071119 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.071112 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:52:41.071119 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.071122 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rw9w\" (UniqueName: \"kubernetes.io/projected/e76fa608-77f5-48a2-8029-94db346c054a-kube-api-access-8rw9w\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:52:41.071351 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.071133 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e76fa608-77f5-48a2-8029-94db346c054a-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:52:41.071351 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.071147 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76fa608-77f5-48a2-8029-94db346c054a-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:52:41.551085 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.551050 2569 generic.go:358] "Generic (PLEG): container finished" podID="e76fa608-77f5-48a2-8029-94db346c054a" containerID="db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2" exitCode=0 Apr 24 21:52:41.551571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.551136 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" event={"ID":"e76fa608-77f5-48a2-8029-94db346c054a","Type":"ContainerDied","Data":"db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2"} Apr 24 21:52:41.551571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.551166 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" Apr 24 21:52:41.551571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.551180 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297" event={"ID":"e76fa608-77f5-48a2-8029-94db346c054a","Type":"ContainerDied","Data":"5e3af4251201edcf70c28d07c508f0838ebf2f919a45c1888d65c327e76124f2"} Apr 24 21:52:41.551571 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.551197 2569 scope.go:117] "RemoveContainer" containerID="db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2" Apr 24 21:52:41.560327 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.560301 2569 scope.go:117] "RemoveContainer" containerID="6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667" Apr 24 21:52:41.568443 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.568392 2569 scope.go:117] "RemoveContainer" containerID="91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1" Apr 24 21:52:41.572019 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.571997 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297"] Apr 24 21:52:41.577383 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.577363 2569 scope.go:117] "RemoveContainer" containerID="db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2" Apr 24 21:52:41.577704 ip-10-0-133-73 kubenswrapper[2569]: E0424 
21:52:41.577652 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2\": container with ID starting with db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2 not found: ID does not exist" containerID="db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2" Apr 24 21:52:41.577797 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.577719 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2"} err="failed to get container status \"db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2\": rpc error: code = NotFound desc = could not find container \"db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2\": container with ID starting with db049194adaec483d5bcac42b8ad3c61cfdf818ef49d09cb891b74b1a19111e2 not found: ID does not exist" Apr 24 21:52:41.577797 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.577747 2569 scope.go:117] "RemoveContainer" containerID="6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667" Apr 24 21:52:41.578006 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:52:41.577988 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667\": container with ID starting with 6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667 not found: ID does not exist" containerID="6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667" Apr 24 21:52:41.578067 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.578012 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667"} err="failed to get container status 
\"6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667\": rpc error: code = NotFound desc = could not find container \"6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667\": container with ID starting with 6b09a88cb47fe05ffa4fc707e831683a64b727fcd24a6c8ccf97b07bf2a2a667 not found: ID does not exist" Apr 24 21:52:41.578067 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.578031 2569 scope.go:117] "RemoveContainer" containerID="91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1" Apr 24 21:52:41.578265 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:52:41.578247 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1\": container with ID starting with 91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1 not found: ID does not exist" containerID="91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1" Apr 24 21:52:41.578323 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.578269 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1"} err="failed to get container status \"91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1\": rpc error: code = NotFound desc = could not find container \"91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1\": container with ID starting with 91e4ce8ef5146c9d629535f0f3d4187206d4a146f4b49cd4e6da7a08a67115d1 not found: ID does not exist" Apr 24 21:52:41.578694 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:41.578655 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6f5c5kq297"] Apr 24 21:52:43.332518 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:43.332482 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e76fa608-77f5-48a2-8029-94db346c054a" path="/var/lib/kubelet/pods/e76fa608-77f5-48a2-8029-94db346c054a/volumes" Apr 24 21:52:51.197790 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.197746 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v"] Apr 24 21:52:51.198351 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.198329 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="main" Apr 24 21:52:51.198449 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.198354 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="main" Apr 24 21:52:51.198449 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.198388 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="tokenizer" Apr 24 21:52:51.198449 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.198397 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="tokenizer" Apr 24 21:52:51.198449 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.198416 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="storage-initializer" Apr 24 21:52:51.198449 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.198426 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="storage-initializer" Apr 24 21:52:51.198729 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.198530 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="tokenizer" Apr 24 21:52:51.198729 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.198550 2569 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="e76fa608-77f5-48a2-8029-94db346c054a" containerName="main" Apr 24 21:52:51.203432 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.203409 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.206305 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.206285 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 24 21:52:51.212894 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.212871 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v"] Apr 24 21:52:51.262437 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.262403 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.262589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.262444 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.262589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.262508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.262589 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.262577 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-home\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.262738 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.262595 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-dshm\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.262738 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.262636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.262738 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.262657 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfztm\" (UniqueName: \"kubernetes.io/projected/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kube-api-access-bfztm\") pod 
\"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.363418 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.363385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-home\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.363418 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.363421 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-dshm\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.363614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.363441 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.363614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.363462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfztm\" (UniqueName: \"kubernetes.io/projected/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kube-api-access-bfztm\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.363614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.363514 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.363614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.363538 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.363614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.363589 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.364060 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.363994 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.364060 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:52:51.364009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.364060 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.364044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.364469 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.364099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-home\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.366102 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.366073 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-dshm\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.366373 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.366354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs\") pod 
\"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.371526 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.371505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfztm\" (UniqueName: \"kubernetes.io/projected/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kube-api-access-bfztm\") pod \"scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.414633 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.414601 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn"] Apr 24 21:52:51.418602 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.418578 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.421155 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.421130 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-ml2dt\"" Apr 24 21:52:51.429610 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.429584 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn"] Apr 24 21:52:51.514824 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.514789 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:52:51.566102 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.565696 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz8lq\" (UniqueName: \"kubernetes.io/projected/0e9ccfa6-0585-4669-b547-08a74b97aef5-kube-api-access-tz8lq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.566102 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.565749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.566102 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.565789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9ccfa6-0585-4669-b547-08a74b97aef5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.566102 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.565806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: 
\"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.566102 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.565827 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.566102 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.565860 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.660437 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.660404 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v"] Apr 24 21:52:51.661357 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:52:51.661330 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76afd7b3_9fae_4333_b5f0_99ffbf869ef9.slice/crio-196e70dfc29268dbbda52921f52c5c10cc0be62c72090e327ef751459ea642e4 WatchSource:0}: Error finding container 196e70dfc29268dbbda52921f52c5c10cc0be62c72090e327ef751459ea642e4: Status 404 returned error can't find the container with id 196e70dfc29268dbbda52921f52c5c10cc0be62c72090e327ef751459ea642e4 Apr 24 21:52:51.667028 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:52:51.667002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz8lq\" (UniqueName: \"kubernetes.io/projected/0e9ccfa6-0585-4669-b547-08a74b97aef5-kube-api-access-tz8lq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.667136 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.667136 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667101 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9ccfa6-0585-4669-b547-08a74b97aef5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.667369 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667338 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.667499 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667397 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.667499 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.667499 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.667780 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.667858 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667793 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.668016 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.667995 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.669861 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.669845 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9ccfa6-0585-4669-b547-08a74b97aef5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.675548 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.675527 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz8lq\" (UniqueName: \"kubernetes.io/projected/0e9ccfa6-0585-4669-b547-08a74b97aef5-kube-api-access-tz8lq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.730152 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.730104 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:51.869537 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:51.869412 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn"] Apr 24 21:52:51.871681 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:52:51.871627 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9ccfa6_0585_4669_b547_08a74b97aef5.slice/crio-2dcd97a61fabb4afbdb6b533c810082933ff0bd8482b4e3d6e19b90bfd644e36 WatchSource:0}: Error finding container 2dcd97a61fabb4afbdb6b533c810082933ff0bd8482b4e3d6e19b90bfd644e36: Status 404 returned error can't find the container with id 2dcd97a61fabb4afbdb6b533c810082933ff0bd8482b4e3d6e19b90bfd644e36 Apr 24 21:52:52.603326 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:52.603281 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" event={"ID":"76afd7b3-9fae-4333-b5f0-99ffbf869ef9","Type":"ContainerStarted","Data":"f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7"} Apr 24 21:52:52.603802 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:52.603334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" event={"ID":"76afd7b3-9fae-4333-b5f0-99ffbf869ef9","Type":"ContainerStarted","Data":"196e70dfc29268dbbda52921f52c5c10cc0be62c72090e327ef751459ea642e4"} Apr 24 21:52:52.605018 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:52.604976 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" 
event={"ID":"0e9ccfa6-0585-4669-b547-08a74b97aef5","Type":"ContainerStarted","Data":"f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a"} Apr 24 21:52:52.605170 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:52.605045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" event={"ID":"0e9ccfa6-0585-4669-b547-08a74b97aef5","Type":"ContainerStarted","Data":"2dcd97a61fabb4afbdb6b533c810082933ff0bd8482b4e3d6e19b90bfd644e36"} Apr 24 21:52:53.611619 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:53.611584 2569 generic.go:358] "Generic (PLEG): container finished" podID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerID="f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a" exitCode=0 Apr 24 21:52:53.612035 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:53.611664 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" event={"ID":"0e9ccfa6-0585-4669-b547-08a74b97aef5","Type":"ContainerDied","Data":"f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a"} Apr 24 21:52:54.617744 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:54.617702 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" event={"ID":"0e9ccfa6-0585-4669-b547-08a74b97aef5","Type":"ContainerStarted","Data":"cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac"} Apr 24 21:52:54.617744 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:54.617746 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" event={"ID":"0e9ccfa6-0585-4669-b547-08a74b97aef5","Type":"ContainerStarted","Data":"55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144"} Apr 24 21:52:54.618252 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:52:54.617844 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:52:54.643748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:52:54.643695 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" podStartSLOduration=3.643656699 podStartE2EDuration="3.643656699s" podCreationTimestamp="2026-04-24 21:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:52:54.641799517 +0000 UTC m=+1585.882152800" watchObservedRunningTime="2026-04-24 21:52:54.643656699 +0000 UTC m=+1585.884009959" Apr 24 21:53:01.731049 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:53:01.731004 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:53:01.731495 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:53:01.731156 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:53:01.733717 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:53:01.733692 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:53:02.652079 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:53:02.652044 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:53:24.661153 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:53:24.661068 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:54:03.904200 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:03.904164 2569 generic.go:358] "Generic (PLEG): container finished" podID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" containerID="f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7" exitCode=0 Apr 24 21:54:03.904816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:03.904244 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" event={"ID":"76afd7b3-9fae-4333-b5f0-99ffbf869ef9","Type":"ContainerDied","Data":"f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7"} Apr 24 21:54:04.910586 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:04.910547 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" event={"ID":"76afd7b3-9fae-4333-b5f0-99ffbf869ef9","Type":"ContainerStarted","Data":"a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f"} Apr 24 21:54:04.932317 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:04.932254 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" podStartSLOduration=73.932235405 podStartE2EDuration="1m13.932235405s" podCreationTimestamp="2026-04-24 21:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:54:04.930816769 +0000 UTC m=+1656.171170030" watchObservedRunningTime="2026-04-24 21:54:04.932235405 +0000 UTC m=+1656.172588666" Apr 24 21:54:11.515239 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:11.515203 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:54:11.515782 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:54:11.515293 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:54:11.528151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:11.528126 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:54:11.948431 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:11.948345 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:54:13.484127 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:13.484094 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v"] Apr 24 21:54:13.486343 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:13.486320 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn"] Apr 24 21:54:13.486648 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:13.486621 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="main" containerID="cri-o://55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144" gracePeriod=30 Apr 24 21:54:13.486748 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:13.486696 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="tokenizer" containerID="cri-o://cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac" gracePeriod=30 Apr 24 21:54:13.947589 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:54:13.947554 2569 generic.go:358] "Generic (PLEG): container finished" podID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerID="55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144" exitCode=0 Apr 24 21:54:13.947788 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:13.947637 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" event={"ID":"0e9ccfa6-0585-4669-b547-08a74b97aef5","Type":"ContainerDied","Data":"55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144"} Apr 24 21:54:14.071641 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:14.071600 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-self-signed-certs: secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 24 21:54:14.071829 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:14.071731 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs podName:76afd7b3-9fae-4333-b5f0-99ffbf869ef9 nodeName:}" failed. No retries permitted until 2026-04-24 21:54:14.57170803 +0000 UTC m=+1665.812061269 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs") pod "scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9") : secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 24 21:54:14.577501 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:14.577465 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-self-signed-certs: secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 24 21:54:14.577923 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:14.577558 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs podName:76afd7b3-9fae-4333-b5f0-99ffbf869ef9 nodeName:}" failed. No retries permitted until 2026-04-24 21:54:15.577538918 +0000 UTC m=+1666.817892157 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs") pod "scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9") : secret "scheduler-configmap-ref-test-kserve-self-signed-certs" not found Apr 24 21:54:14.659494 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:54:14.659466 2569 logging.go:55] [core] [Channel #992 SubChannel #993]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.57:9003", ServerName: "10.133.0.57:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.57:9003: connect: connection refused" Apr 24 21:54:14.869740 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.869713 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:54:14.953782 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.953751 2569 generic.go:358] "Generic (PLEG): container finished" podID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerID="cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac" exitCode=0 Apr 24 21:54:14.953935 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.953831 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" Apr 24 21:54:14.953935 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.953833 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" event={"ID":"0e9ccfa6-0585-4669-b547-08a74b97aef5","Type":"ContainerDied","Data":"cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac"} Apr 24 21:54:14.953935 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.953873 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" event={"ID":"0e9ccfa6-0585-4669-b547-08a74b97aef5","Type":"ContainerDied","Data":"2dcd97a61fabb4afbdb6b533c810082933ff0bd8482b4e3d6e19b90bfd644e36"} Apr 24 21:54:14.953935 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.953888 2569 scope.go:117] "RemoveContainer" containerID="cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac" Apr 24 21:54:14.954208 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.954184 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" podUID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" containerName="main" containerID="cri-o://a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f" gracePeriod=30 Apr 24 
21:54:14.962745 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.962727 2569 scope.go:117] "RemoveContainer" containerID="55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144" Apr 24 21:54:14.971051 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.971034 2569 scope.go:117] "RemoveContainer" containerID="f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a" Apr 24 21:54:14.979239 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.979217 2569 scope.go:117] "RemoveContainer" containerID="cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac" Apr 24 21:54:14.979517 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:14.979498 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac\": container with ID starting with cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac not found: ID does not exist" containerID="cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac" Apr 24 21:54:14.979580 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.979531 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac"} err="failed to get container status \"cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac\": rpc error: code = NotFound desc = could not find container \"cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac\": container with ID starting with cd20ec55e09edf5e7d63ec96c9ae22349989bce9f5e029378594a9421cec58ac not found: ID does not exist" Apr 24 21:54:14.979580 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.979551 2569 scope.go:117] "RemoveContainer" containerID="55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144" Apr 24 21:54:14.979857 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:14.979839 2569 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144\": container with ID starting with 55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144 not found: ID does not exist" containerID="55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144" Apr 24 21:54:14.979909 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.979864 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144"} err="failed to get container status \"55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144\": rpc error: code = NotFound desc = could not find container \"55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144\": container with ID starting with 55aeb56e0884ef44d5658a4b720de013947f3dcd35ee9b66545986d09076b144 not found: ID does not exist" Apr 24 21:54:14.979909 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.979883 2569 scope.go:117] "RemoveContainer" containerID="f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a" Apr 24 21:54:14.980087 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:14.980072 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a\": container with ID starting with f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a not found: ID does not exist" containerID="f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a" Apr 24 21:54:14.980128 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.980091 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a"} err="failed to get container status 
\"f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a\": rpc error: code = NotFound desc = could not find container \"f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a\": container with ID starting with f70770ed1cd004c833b37bed9f3c6ce6310bb29e1441b64b34dc21f545319e0a not found: ID does not exist" Apr 24 21:54:14.981379 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981364 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-tmp\") pod \"0e9ccfa6-0585-4669-b547-08a74b97aef5\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " Apr 24 21:54:14.981443 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981399 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9ccfa6-0585-4669-b547-08a74b97aef5-tls-certs\") pod \"0e9ccfa6-0585-4669-b547-08a74b97aef5\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " Apr 24 21:54:14.981443 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981431 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz8lq\" (UniqueName: \"kubernetes.io/projected/0e9ccfa6-0585-4669-b547-08a74b97aef5-kube-api-access-tz8lq\") pod \"0e9ccfa6-0585-4669-b547-08a74b97aef5\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " Apr 24 21:54:14.981518 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981446 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-cache\") pod \"0e9ccfa6-0585-4669-b547-08a74b97aef5\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " Apr 24 21:54:14.981518 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981469 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" 
(UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-uds\") pod \"0e9ccfa6-0585-4669-b547-08a74b97aef5\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " Apr 24 21:54:14.981518 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981485 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-kserve-provision-location\") pod \"0e9ccfa6-0585-4669-b547-08a74b97aef5\" (UID: \"0e9ccfa6-0585-4669-b547-08a74b97aef5\") " Apr 24 21:54:14.981730 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981706 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0e9ccfa6-0585-4669-b547-08a74b97aef5" (UID: "0e9ccfa6-0585-4669-b547-08a74b97aef5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:14.981793 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981730 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0e9ccfa6-0585-4669-b547-08a74b97aef5" (UID: "0e9ccfa6-0585-4669-b547-08a74b97aef5"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:14.981859 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.981836 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0e9ccfa6-0585-4669-b547-08a74b97aef5" (UID: "0e9ccfa6-0585-4669-b547-08a74b97aef5"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:14.982253 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.982230 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0e9ccfa6-0585-4669-b547-08a74b97aef5" (UID: "0e9ccfa6-0585-4669-b547-08a74b97aef5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:14.983566 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.983546 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e9ccfa6-0585-4669-b547-08a74b97aef5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0e9ccfa6-0585-4669-b547-08a74b97aef5" (UID: "0e9ccfa6-0585-4669-b547-08a74b97aef5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:54:14.983631 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:14.983608 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9ccfa6-0585-4669-b547-08a74b97aef5-kube-api-access-tz8lq" (OuterVolumeSpecName: "kube-api-access-tz8lq") pod "0e9ccfa6-0585-4669-b547-08a74b97aef5" (UID: "0e9ccfa6-0585-4669-b547-08a74b97aef5"). InnerVolumeSpecName "kube-api-access-tz8lq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:54:15.082559 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.082517 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.082559 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.082552 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.082559 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.082562 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.082819 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.082572 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e9ccfa6-0585-4669-b547-08a74b97aef5-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.082819 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.082580 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tz8lq\" (UniqueName: \"kubernetes.io/projected/0e9ccfa6-0585-4669-b547-08a74b97aef5-kube-api-access-tz8lq\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.082819 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.082589 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0e9ccfa6-0585-4669-b547-08a74b97aef5-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.235374 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 21:54:15.235349 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:54:15.278157 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.278121 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn"] Apr 24 21:54:15.285952 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.285920 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn"] Apr 24 21:54:15.333115 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.333081 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" path="/var/lib/kubelet/pods/0e9ccfa6-0585-4669-b547-08a74b97aef5/volumes" Apr 24 21:54:15.385308 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385277 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs\") pod \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " Apr 24 21:54:15.385308 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385316 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-model-cache\") pod \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " Apr 24 21:54:15.385544 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385345 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-dshm\") pod \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " Apr 24 
21:54:15.385544 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385375 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-home\") pod \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " Apr 24 21:54:15.385544 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385408 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tmp-dir\") pod \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " Apr 24 21:54:15.385544 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385492 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kserve-provision-location\") pod \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " Apr 24 21:54:15.385544 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385517 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfztm\" (UniqueName: \"kubernetes.io/projected/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kube-api-access-bfztm\") pod \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\" (UID: \"76afd7b3-9fae-4333-b5f0-99ffbf869ef9\") " Apr 24 21:54:15.385835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385564 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-model-cache" (OuterVolumeSpecName: "model-cache") pod "76afd7b3-9fae-4333-b5f0-99ffbf869ef9" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:15.385835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385662 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-home" (OuterVolumeSpecName: "home") pod "76afd7b3-9fae-4333-b5f0-99ffbf869ef9" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:15.385920 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385856 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-model-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.385920 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385876 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-home\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.386020 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.385942 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "76afd7b3-9fae-4333-b5f0-99ffbf869ef9" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:15.387693 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.387650 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-dshm" (OuterVolumeSpecName: "dshm") pod "76afd7b3-9fae-4333-b5f0-99ffbf869ef9" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:15.387818 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.387705 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kube-api-access-bfztm" (OuterVolumeSpecName: "kube-api-access-bfztm") pod "76afd7b3-9fae-4333-b5f0-99ffbf869ef9" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9"). InnerVolumeSpecName "kube-api-access-bfztm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:54:15.387818 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.387761 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "76afd7b3-9fae-4333-b5f0-99ffbf869ef9" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:54:15.486584 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.486506 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bfztm\" (UniqueName: \"kubernetes.io/projected/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kube-api-access-bfztm\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.486584 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.486535 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.486584 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.486545 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-dshm\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.486584 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.486553 2569 reconciler_common.go:299] 
"Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-tmp-dir\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:15.659584 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.659541 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fbbfsntpn" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.57:9003\" within 1s: context deadline exceeded" Apr 24 21:54:15.959337 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.959306 2569 generic.go:358] "Generic (PLEG): container finished" podID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" containerID="a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f" exitCode=0 Apr 24 21:54:15.959535 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.959385 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" Apr 24 21:54:15.959535 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.959386 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" event={"ID":"76afd7b3-9fae-4333-b5f0-99ffbf869ef9","Type":"ContainerDied","Data":"a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f"} Apr 24 21:54:15.959535 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.959433 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v" event={"ID":"76afd7b3-9fae-4333-b5f0-99ffbf869ef9","Type":"ContainerDied","Data":"196e70dfc29268dbbda52921f52c5c10cc0be62c72090e327ef751459ea642e4"} Apr 24 21:54:15.959535 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.959455 2569 scope.go:117] "RemoveContainer" containerID="a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f" Apr 24 21:54:15.968455 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:15.968441 2569 scope.go:117] "RemoveContainer" containerID="f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7" Apr 24 21:54:16.028799 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:16.028777 2569 scope.go:117] "RemoveContainer" containerID="a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f" Apr 24 21:54:16.029095 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:16.029071 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f\": container with ID starting with a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f not found: ID does not exist" containerID="a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f" Apr 24 21:54:16.029163 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:16.029106 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f"} err="failed to get container status \"a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f\": rpc error: code = NotFound desc = could not find container \"a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f\": container with ID starting with a4c1c66ad8d672d32d7bfa9ece0e79dfc5be6b2162c28a9c9c702ffcba55985f not found: ID does not exist" Apr 24 21:54:16.029163 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:16.029129 2569 scope.go:117] "RemoveContainer" containerID="f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7" Apr 24 21:54:16.029437 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:54:16.029414 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7\": container with ID starting with f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7 not found: ID does not exist" containerID="f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7" Apr 24 21:54:16.029498 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:16.029443 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7"} err="failed to get container status \"f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7\": rpc error: code = NotFound desc = could not find container \"f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7\": container with ID starting with f55ea3af5421478a54d28e5b826095b76423dc4de27fc76eb03ff57a0e3e2dc7 not found: ID does not exist" Apr 24 21:54:16.198548 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:16.198497 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "76afd7b3-9fae-4333-b5f0-99ffbf869ef9" (UID: "76afd7b3-9fae-4333-b5f0-99ffbf869ef9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:16.285078 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:16.285044 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v"] Apr 24 21:54:16.289155 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:16.289131 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7c65cf4c7c-vt24v"] Apr 24 21:54:16.294356 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:16.294338 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76afd7b3-9fae-4333-b5f0-99ffbf869ef9-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:54:17.332459 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:17.332414 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" path="/var/lib/kubelet/pods/76afd7b3-9fae-4333-b5f0-99ffbf869ef9/volumes" Apr 24 21:54:29.775749 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.775714 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8"] Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776082 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="main" Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776093 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="main" 
Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776104 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" containerName="storage-initializer" Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776109 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" containerName="storage-initializer" Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776115 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" containerName="main" Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776121 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" containerName="main" Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776133 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="tokenizer" Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776138 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="tokenizer" Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776146 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="storage-initializer" Apr 24 21:54:29.776151 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776150 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="storage-initializer" Apr 24 21:54:29.776448 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776199 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="main" Apr 24 
21:54:29.776448 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776211 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e9ccfa6-0585-4669-b547-08a74b97aef5" containerName="tokenizer" Apr 24 21:54:29.776448 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.776217 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="76afd7b3-9fae-4333-b5f0-99ffbf869ef9" containerName="main" Apr 24 21:54:29.780523 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.780503 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:29.783229 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.783211 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 24 21:54:29.789252 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.789227 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8"] Apr 24 21:54:29.919522 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.919472 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-dshm\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:29.919732 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.919535 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-home\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:29.919732 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.919621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:29.919732 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.919705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:29.919874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.919734 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2mj\" (UniqueName: \"kubernetes.io/projected/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kube-api-access-7h2mj\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:29.919874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.919755 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" 
Apr 24 21:54:29.919874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:29.919837 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.020519 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.020484 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2mj\" (UniqueName: \"kubernetes.io/projected/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kube-api-access-7h2mj\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.020519 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.020521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.020795 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.020555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.020795 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.020575 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-dshm\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.020795 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.020598 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-home\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.020795 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.020629 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.021058 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.020663 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.021272 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.021075 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: 
\"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.021343 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.021100 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.021343 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.021158 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.021343 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.021164 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-home\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.023077 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.023056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-dshm\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.023192 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.023174 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.029068 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.029000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2mj\" (UniqueName: \"kubernetes.io/projected/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kube-api-access-7h2mj\") pod \"scheduler-ha-replicas-test-kserve-67c65b665-vvzp8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.093086 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.093052 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:54:30.114779 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.114752 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk"] Apr 24 21:54:30.120947 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.120848 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.124362 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.124338 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-8xhtd\"" Apr 24 21:54:30.134568 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.134546 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk"] Apr 24 21:54:30.222566 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.222526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.222744 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.222611 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.222744 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.222688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnw6g\" (UniqueName: \"kubernetes.io/projected/587b900b-8654-41d8-9440-327de3a8f69a-kube-api-access-fnw6g\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: 
\"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.222744 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.222720 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.222876 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.222795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.222876 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.222827 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587b900b-8654-41d8-9440-327de3a8f69a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.230324 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.230300 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8"] Apr 24 21:54:30.231795 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:54:30.231770 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3f787d_96f3_47a5_8442_71b04b5ba6f8.slice/crio-eaebdfaa112d0ff4b4fab01919900d0d0eebd7f7317cf9bfbd97cf42501d7425 WatchSource:0}: Error finding container eaebdfaa112d0ff4b4fab01919900d0d0eebd7f7317cf9bfbd97cf42501d7425: Status 404 returned error can't find the container with id eaebdfaa112d0ff4b4fab01919900d0d0eebd7f7317cf9bfbd97cf42501d7425 Apr 24 21:54:30.323284 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.323436 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323297 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnw6g\" (UniqueName: \"kubernetes.io/projected/587b900b-8654-41d8-9440-327de3a8f69a-kube-api-access-fnw6g\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.323436 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.323436 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323343 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.323723 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323691 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.323723 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323549 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587b900b-8654-41d8-9440-327de3a8f69a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.323884 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.323884 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323783 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.323884 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.323828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.324216 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.324032 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.325991 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.325974 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587b900b-8654-41d8-9440-327de3a8f69a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.332700 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.332649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnw6g\" (UniqueName: 
\"kubernetes.io/projected/587b900b-8654-41d8-9440-327de3a8f69a-kube-api-access-fnw6g\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.435913 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.435880 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:30.573639 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:30.573608 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk"] Apr 24 21:54:30.575042 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:54:30.575015 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod587b900b_8654_41d8_9440_327de3a8f69a.slice/crio-50c3c3f3bb064bfa719d48d38f1f2146c00f13cf450b80f27a66ba2512604f9a WatchSource:0}: Error finding container 50c3c3f3bb064bfa719d48d38f1f2146c00f13cf450b80f27a66ba2512604f9a: Status 404 returned error can't find the container with id 50c3c3f3bb064bfa719d48d38f1f2146c00f13cf450b80f27a66ba2512604f9a Apr 24 21:54:31.022911 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:31.022875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" event={"ID":"587b900b-8654-41d8-9440-327de3a8f69a","Type":"ContainerStarted","Data":"2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865"} Apr 24 21:54:31.023349 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:31.022919 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" 
event={"ID":"587b900b-8654-41d8-9440-327de3a8f69a","Type":"ContainerStarted","Data":"50c3c3f3bb064bfa719d48d38f1f2146c00f13cf450b80f27a66ba2512604f9a"} Apr 24 21:54:31.024562 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:31.024534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" event={"ID":"2c3f787d-96f3-47a5-8442-71b04b5ba6f8","Type":"ContainerStarted","Data":"e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c"} Apr 24 21:54:31.024699 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:31.024568 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" event={"ID":"2c3f787d-96f3-47a5-8442-71b04b5ba6f8","Type":"ContainerStarted","Data":"eaebdfaa112d0ff4b4fab01919900d0d0eebd7f7317cf9bfbd97cf42501d7425"} Apr 24 21:54:32.029358 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:32.029323 2569 generic.go:358] "Generic (PLEG): container finished" podID="587b900b-8654-41d8-9440-327de3a8f69a" containerID="2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865" exitCode=0 Apr 24 21:54:32.029782 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:32.029416 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" event={"ID":"587b900b-8654-41d8-9440-327de3a8f69a","Type":"ContainerDied","Data":"2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865"} Apr 24 21:54:33.035316 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:33.035272 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" event={"ID":"587b900b-8654-41d8-9440-327de3a8f69a","Type":"ContainerStarted","Data":"bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d"} Apr 24 21:54:33.035316 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:33.035317 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" event={"ID":"587b900b-8654-41d8-9440-327de3a8f69a","Type":"ContainerStarted","Data":"69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2"} Apr 24 21:54:33.035762 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:33.035428 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:33.061838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:33.061788 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" podStartSLOduration=3.061774321 podStartE2EDuration="3.061774321s" podCreationTimestamp="2026-04-24 21:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:54:33.05917076 +0000 UTC m=+1684.299524021" watchObservedRunningTime="2026-04-24 21:54:33.061774321 +0000 UTC m=+1684.302127580" Apr 24 21:54:40.436207 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:40.436165 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:40.436207 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:40.436210 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:40.438829 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:40.438801 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:54:41.072485 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:54:41.072456 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:55:02.075664 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:55:02.075633 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:56:00.385593 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:00.385561 2569 generic.go:358] "Generic (PLEG): container finished" podID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" containerID="e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c" exitCode=0 Apr 24 21:56:00.386025 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:00.385639 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" event={"ID":"2c3f787d-96f3-47a5-8442-71b04b5ba6f8","Type":"ContainerDied","Data":"e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c"} Apr 24 21:56:00.386880 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:00.386865 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:56:01.391329 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:01.391295 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" event={"ID":"2c3f787d-96f3-47a5-8442-71b04b5ba6f8","Type":"ContainerStarted","Data":"62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4"} Apr 24 21:56:01.414605 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:01.414550 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" podStartSLOduration=92.414538056 podStartE2EDuration="1m32.414538056s" podCreationTimestamp="2026-04-24 21:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:01.411844588 +0000 UTC m=+1772.652197848" watchObservedRunningTime="2026-04-24 21:56:01.414538056 +0000 UTC m=+1772.654891315" Apr 24 21:56:10.093177 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:10.093139 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:56:10.093790 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:10.093228 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:56:10.105721 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:10.105694 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:56:10.437555 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:10.437462 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:56:11.969713 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:11.969662 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk"] Apr 24 21:56:11.970103 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:11.969977 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="main" containerID="cri-o://69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2" gracePeriod=30 Apr 24 21:56:11.970103 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:11.970049 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="tokenizer" containerID="cri-o://bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d" gracePeriod=30 Apr 24 21:56:11.980267 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:11.980236 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8"] Apr 24 21:56:12.075588 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:56:12.075546 2569 logging.go:55] [core] [Channel #1114 SubChannel #1115]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.59:9003", ServerName: "10.133.0.59:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.59:9003: connect: connection refused" Apr 24 21:56:12.435573 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:12.435542 2569 generic.go:358] "Generic (PLEG): container finished" podID="587b900b-8654-41d8-9440-327de3a8f69a" containerID="69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2" exitCode=0 Apr 24 21:56:12.435774 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:12.435623 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" event={"ID":"587b900b-8654-41d8-9440-327de3a8f69a","Type":"ContainerDied","Data":"69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2"} Apr 24 21:56:12.628299 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:12.628250 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 24 21:56:12.628483 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:12.628338 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs 
podName:2c3f787d-96f3-47a5-8442-71b04b5ba6f8 nodeName:}" failed. No retries permitted until 2026-04-24 21:56:13.12831762 +0000 UTC m=+1784.368670859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs") pod "scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 24 21:56:13.076008 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.075968 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.59:9003\" within 1s: context deadline exceeded" Apr 24 21:56:13.132465 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:13.132436 2569 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 24 21:56:13.132620 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:13.132515 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs podName:2c3f787d-96f3-47a5-8442-71b04b5ba6f8 nodeName:}" failed. No retries permitted until 2026-04-24 21:56:14.132500808 +0000 UTC m=+1785.372854046 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs") pod "scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 24 21:56:13.231061 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.231040 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:56:13.334022 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.333945 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-uds\") pod \"587b900b-8654-41d8-9440-327de3a8f69a\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " Apr 24 21:56:13.334022 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.333983 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587b900b-8654-41d8-9440-327de3a8f69a-tls-certs\") pod \"587b900b-8654-41d8-9440-327de3a8f69a\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " Apr 24 21:56:13.334268 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334026 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnw6g\" (UniqueName: \"kubernetes.io/projected/587b900b-8654-41d8-9440-327de3a8f69a-kube-api-access-fnw6g\") pod \"587b900b-8654-41d8-9440-327de3a8f69a\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " Apr 24 21:56:13.334268 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334157 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-cache\") pod 
\"587b900b-8654-41d8-9440-327de3a8f69a\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " Apr 24 21:56:13.334268 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334210 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-tmp\") pod \"587b900b-8654-41d8-9440-327de3a8f69a\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " Apr 24 21:56:13.334268 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334234 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-kserve-provision-location\") pod \"587b900b-8654-41d8-9440-327de3a8f69a\" (UID: \"587b900b-8654-41d8-9440-327de3a8f69a\") " Apr 24 21:56:13.334268 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334245 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "587b900b-8654-41d8-9440-327de3a8f69a" (UID: "587b900b-8654-41d8-9440-327de3a8f69a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.334532 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334410 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "587b900b-8654-41d8-9440-327de3a8f69a" (UID: "587b900b-8654-41d8-9440-327de3a8f69a"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.334583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334545 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.334583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334565 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.334583 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334564 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "587b900b-8654-41d8-9440-327de3a8f69a" (UID: "587b900b-8654-41d8-9440-327de3a8f69a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.334953 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.334933 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "587b900b-8654-41d8-9440-327de3a8f69a" (UID: "587b900b-8654-41d8-9440-327de3a8f69a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.336297 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.336270 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587b900b-8654-41d8-9440-327de3a8f69a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "587b900b-8654-41d8-9440-327de3a8f69a" (UID: "587b900b-8654-41d8-9440-327de3a8f69a"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:56:13.336402 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.336295 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587b900b-8654-41d8-9440-327de3a8f69a-kube-api-access-fnw6g" (OuterVolumeSpecName: "kube-api-access-fnw6g") pod "587b900b-8654-41d8-9440-327de3a8f69a" (UID: "587b900b-8654-41d8-9440-327de3a8f69a"). InnerVolumeSpecName "kube-api-access-fnw6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:56:13.435136 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.435105 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587b900b-8654-41d8-9440-327de3a8f69a-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.435136 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.435131 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnw6g\" (UniqueName: \"kubernetes.io/projected/587b900b-8654-41d8-9440-327de3a8f69a-kube-api-access-fnw6g\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.435136 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.435141 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.435362 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.435150 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587b900b-8654-41d8-9440-327de3a8f69a-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.441482 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.441452 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="587b900b-8654-41d8-9440-327de3a8f69a" containerID="bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d" exitCode=0 Apr 24 21:56:13.441631 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.441525 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" Apr 24 21:56:13.441631 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.441545 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" event={"ID":"587b900b-8654-41d8-9440-327de3a8f69a","Type":"ContainerDied","Data":"bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d"} Apr 24 21:56:13.441631 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.441590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk" event={"ID":"587b900b-8654-41d8-9440-327de3a8f69a","Type":"ContainerDied","Data":"50c3c3f3bb064bfa719d48d38f1f2146c00f13cf450b80f27a66ba2512604f9a"} Apr 24 21:56:13.441631 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.441611 2569 scope.go:117] "RemoveContainer" containerID="bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d" Apr 24 21:56:13.441991 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.441961 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" podUID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" containerName="main" containerID="cri-o://62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4" gracePeriod=30 Apr 24 21:56:13.450843 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.450824 2569 scope.go:117] "RemoveContainer" containerID="69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2" Apr 24 21:56:13.473057 ip-10-0-133-73 kubenswrapper[2569]: I0424 
21:56:13.473011 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk"] Apr 24 21:56:13.477779 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.477755 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdct5wrk"] Apr 24 21:56:13.499373 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.499342 2569 scope.go:117] "RemoveContainer" containerID="2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865" Apr 24 21:56:13.553390 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.553369 2569 scope.go:117] "RemoveContainer" containerID="bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d" Apr 24 21:56:13.553804 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:13.553782 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d\": container with ID starting with bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d not found: ID does not exist" containerID="bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d" Apr 24 21:56:13.553867 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.553816 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d"} err="failed to get container status \"bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d\": rpc error: code = NotFound desc = could not find container \"bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d\": container with ID starting with bcd4bef4175e82b4c4dd192520b1a5fa2d032962cbfcae3f7818122c25e3fc5d not found: ID does not exist" Apr 24 21:56:13.553867 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.553836 2569 scope.go:117] "RemoveContainer" 
containerID="69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2" Apr 24 21:56:13.554129 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:13.554110 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2\": container with ID starting with 69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2 not found: ID does not exist" containerID="69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2" Apr 24 21:56:13.554205 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.554153 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2"} err="failed to get container status \"69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2\": rpc error: code = NotFound desc = could not find container \"69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2\": container with ID starting with 69b40e420d6669499b3d7bb3ab47007fa51b8ac5cbb5ff512d243cbe74ba5aa2 not found: ID does not exist" Apr 24 21:56:13.554205 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.554179 2569 scope.go:117] "RemoveContainer" containerID="2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865" Apr 24 21:56:13.554437 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:13.554416 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865\": container with ID starting with 2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865 not found: ID does not exist" containerID="2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865" Apr 24 21:56:13.554534 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.554441 2569 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865"} err="failed to get container status \"2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865\": rpc error: code = NotFound desc = could not find container \"2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865\": container with ID starting with 2b63638f71ef2903d9b1e97643038a3275d9b387da1af7564b9716961d48b865 not found: ID does not exist" Apr 24 21:56:13.689130 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.689108 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:56:13.839551 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839514 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-dshm\") pod \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " Apr 24 21:56:13.839746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839572 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tmp-dir\") pod \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " Apr 24 21:56:13.839746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839615 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h2mj\" (UniqueName: \"kubernetes.io/projected/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kube-api-access-7h2mj\") pod \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " Apr 24 21:56:13.839746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839642 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-model-cache\") pod \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " Apr 24 21:56:13.839746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839693 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kserve-provision-location\") pod \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " Apr 24 21:56:13.839972 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839746 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs\") pod \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " Apr 24 21:56:13.839972 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839793 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-home\") pod \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\" (UID: \"2c3f787d-96f3-47a5-8442-71b04b5ba6f8\") " Apr 24 21:56:13.839972 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839937 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-model-cache" (OuterVolumeSpecName: "model-cache") pod "2c3f787d-96f3-47a5-8442-71b04b5ba6f8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.840125 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.839946 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "2c3f787d-96f3-47a5-8442-71b04b5ba6f8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.840125 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.840078 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tmp-dir\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.840125 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.840098 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-model-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.840294 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.840134 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-home" (OuterVolumeSpecName: "home") pod "2c3f787d-96f3-47a5-8442-71b04b5ba6f8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.842017 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.841976 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2c3f787d-96f3-47a5-8442-71b04b5ba6f8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:56:13.842133 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.842029 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-dshm" (OuterVolumeSpecName: "dshm") pod "2c3f787d-96f3-47a5-8442-71b04b5ba6f8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.842133 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.842035 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kube-api-access-7h2mj" (OuterVolumeSpecName: "kube-api-access-7h2mj") pod "2c3f787d-96f3-47a5-8442-71b04b5ba6f8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8"). InnerVolumeSpecName "kube-api-access-7h2mj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:56:13.893895 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.893818 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2c3f787d-96f3-47a5-8442-71b04b5ba6f8" (UID: "2c3f787d-96f3-47a5-8442-71b04b5ba6f8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:13.941451 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.941417 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7h2mj\" (UniqueName: \"kubernetes.io/projected/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kube-api-access-7h2mj\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.941451 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.941444 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.941451 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.941454 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.941650 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.941465 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-home\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.941650 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:13.941474 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2c3f787d-96f3-47a5-8442-71b04b5ba6f8-dshm\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:56:14.447692 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.447633 2569 generic.go:358] "Generic (PLEG): container finished" podID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" containerID="62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4" exitCode=0 Apr 24 21:56:14.448110 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.447720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" event={"ID":"2c3f787d-96f3-47a5-8442-71b04b5ba6f8","Type":"ContainerDied","Data":"62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4"} Apr 24 21:56:14.448110 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.447755 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" event={"ID":"2c3f787d-96f3-47a5-8442-71b04b5ba6f8","Type":"ContainerDied","Data":"eaebdfaa112d0ff4b4fab01919900d0d0eebd7f7317cf9bfbd97cf42501d7425"} Apr 24 21:56:14.448110 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.447773 2569 scope.go:117] "RemoveContainer" containerID="62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4" Apr 24 21:56:14.448110 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.447729 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8" Apr 24 21:56:14.456562 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.456546 2569 scope.go:117] "RemoveContainer" containerID="e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c" Apr 24 21:56:14.471722 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.471704 2569 scope.go:117] "RemoveContainer" containerID="62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4" Apr 24 21:56:14.472006 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:14.471983 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4\": container with ID starting with 62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4 not found: ID does not exist" containerID="62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4" Apr 24 21:56:14.472066 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.472020 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4"} err="failed to get container status \"62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4\": rpc error: code = NotFound desc = could not find container \"62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4\": container with ID starting with 62dbaee91c6d9e42b174542afdecf2708c0d16774c2ef5c10efd6da94b0069d4 not found: ID does not exist" Apr 24 21:56:14.472066 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.472043 2569 scope.go:117] "RemoveContainer" containerID="e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c" Apr 24 21:56:14.472345 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:56:14.472314 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c\": container with ID starting with e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c not found: ID does not exist" containerID="e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c" Apr 24 21:56:14.472345 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.472338 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c"} err="failed to get container status \"e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c\": rpc error: code = NotFound desc = could not find container \"e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c\": container with ID starting with e6e112444e6b9906448d25b93bd766550d76c51fa331c5ef1fd106a7e12b8e5c not found: ID does not exist" Apr 24 21:56:14.473635 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.473613 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8"] Apr 24 21:56:14.478219 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:14.478197 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-67c65b665-vvzp8"] Apr 24 21:56:15.332963 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:15.332931 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" path="/var/lib/kubelet/pods/2c3f787d-96f3-47a5-8442-71b04b5ba6f8/volumes" Apr 24 21:56:15.333391 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:15.333378 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587b900b-8654-41d8-9440-327de3a8f69a" path="/var/lib/kubelet/pods/587b900b-8654-41d8-9440-327de3a8f69a/volumes" Apr 24 21:56:22.021414 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021331 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r"] Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021752 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="main" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021765 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="main" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021783 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="storage-initializer" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021790 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="storage-initializer" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021798 
2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="tokenizer" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021803 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="tokenizer" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021814 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" containerName="main" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021821 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" containerName="main" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021835 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" containerName="storage-initializer" Apr 24 21:56:22.021838 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021839 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" containerName="storage-initializer" Apr 24 21:56:22.022178 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021902 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c3f787d-96f3-47a5-8442-71b04b5ba6f8" containerName="main" Apr 24 21:56:22.022178 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021909 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="main" Apr 24 21:56:22.022178 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.021917 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="587b900b-8654-41d8-9440-327de3a8f69a" containerName="tokenizer" Apr 24 21:56:22.027094 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.027072 2569 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.030776 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.030757 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 24 21:56:22.041634 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.041608 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r"] Apr 24 21:56:22.108147 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.108117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-model-cache\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.108324 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.108184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-dshm\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.108324 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.108215 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4303159-fb12-43df-9120-ade192c54c3e-tls-certs\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.108324 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:56:22.108244 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-home\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.108324 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.108296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.108458 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.108336 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96mw8\" (UniqueName: \"kubernetes.io/projected/c4303159-fb12-43df-9120-ade192c54c3e-kube-api-access-96mw8\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.108458 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.108384 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-tmp-dir\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209216 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209186 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-model-cache\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209394 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-dshm\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209394 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4303159-fb12-43df-9120-ade192c54c3e-tls-certs\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209394 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-home\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209394 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209317 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-kserve-provision-location\") pod 
\"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209394 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96mw8\" (UniqueName: \"kubernetes.io/projected/c4303159-fb12-43df-9120-ade192c54c3e-kube-api-access-96mw8\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209394 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209374 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-tmp-dir\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-model-cache\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209755 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209710 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-tmp-dir\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209755 
ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209734 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.209874 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.209804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-home\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.211701 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.211659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-dshm\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.212027 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.212010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4303159-fb12-43df-9120-ade192c54c3e-tls-certs\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.219948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.219921 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96mw8\" (UniqueName: 
\"kubernetes.io/projected/c4303159-fb12-43df-9120-ade192c54c3e-kube-api-access-96mw8\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5l55r\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.336906 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.336825 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:22.470502 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.470478 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r"] Apr 24 21:56:22.472209 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:56:22.472180 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4303159_fb12_43df_9120_ade192c54c3e.slice/crio-c2c408f17e355b6251173890eb30265aef9cbdea981447413926fca61133fc06 WatchSource:0}: Error finding container c2c408f17e355b6251173890eb30265aef9cbdea981447413926fca61133fc06: Status 404 returned error can't find the container with id c2c408f17e355b6251173890eb30265aef9cbdea981447413926fca61133fc06 Apr 24 21:56:22.488460 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:22.488433 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" event={"ID":"c4303159-fb12-43df-9120-ade192c54c3e","Type":"ContainerStarted","Data":"c2c408f17e355b6251173890eb30265aef9cbdea981447413926fca61133fc06"} Apr 24 21:56:23.495187 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:23.495149 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" event={"ID":"c4303159-fb12-43df-9120-ade192c54c3e","Type":"ContainerStarted","Data":"27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346"} 
Apr 24 21:56:28.516088 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:28.516056 2569 generic.go:358] "Generic (PLEG): container finished" podID="c4303159-fb12-43df-9120-ade192c54c3e" containerID="27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346" exitCode=0 Apr 24 21:56:28.516456 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:28.516125 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" event={"ID":"c4303159-fb12-43df-9120-ade192c54c3e","Type":"ContainerDied","Data":"27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346"} Apr 24 21:56:29.439796 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:29.439772 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:56:29.444956 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:29.444936 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 21:56:29.521271 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:29.521243 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" event={"ID":"c4303159-fb12-43df-9120-ade192c54c3e","Type":"ContainerStarted","Data":"72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed"} Apr 24 21:56:29.544711 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:29.544644 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" podStartSLOduration=8.54463101 podStartE2EDuration="8.54463101s" podCreationTimestamp="2026-04-24 21:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:29.543258514 
+0000 UTC m=+1800.783611774" watchObservedRunningTime="2026-04-24 21:56:29.54463101 +0000 UTC m=+1800.784984270" Apr 24 21:56:32.337746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:32.337702 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:32.337746 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:32.337751 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:32.350026 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:32.350004 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:56:32.543225 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:56:32.543193 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:57:08.325534 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.325498 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r"] Apr 24 21:57:08.326230 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.325823 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" podUID="c4303159-fb12-43df-9120-ade192c54c3e" containerName="main" containerID="cri-o://72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed" gracePeriod=30 Apr 24 21:57:08.571039 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.571012 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:57:08.611764 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.611682 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-kserve-provision-location\") pod \"c4303159-fb12-43df-9120-ade192c54c3e\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " Apr 24 21:57:08.611764 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.611744 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4303159-fb12-43df-9120-ade192c54c3e-tls-certs\") pod \"c4303159-fb12-43df-9120-ade192c54c3e\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " Apr 24 21:57:08.611947 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.611784 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-tmp-dir\") pod \"c4303159-fb12-43df-9120-ade192c54c3e\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " Apr 24 21:57:08.611947 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.611807 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96mw8\" (UniqueName: \"kubernetes.io/projected/c4303159-fb12-43df-9120-ade192c54c3e-kube-api-access-96mw8\") pod \"c4303159-fb12-43df-9120-ade192c54c3e\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " Apr 24 21:57:08.612041 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.612020 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-home\") pod \"c4303159-fb12-43df-9120-ade192c54c3e\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " Apr 24 21:57:08.612098 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:57:08.612041 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "c4303159-fb12-43df-9120-ade192c54c3e" (UID: "c4303159-fb12-43df-9120-ade192c54c3e"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.612098 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.612078 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-model-cache\") pod \"c4303159-fb12-43df-9120-ade192c54c3e\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " Apr 24 21:57:08.612198 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.612107 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-dshm\") pod \"c4303159-fb12-43df-9120-ade192c54c3e\" (UID: \"c4303159-fb12-43df-9120-ade192c54c3e\") " Apr 24 21:57:08.612255 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.612240 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-home" (OuterVolumeSpecName: "home") pod "c4303159-fb12-43df-9120-ade192c54c3e" (UID: "c4303159-fb12-43df-9120-ade192c54c3e"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.612457 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.612384 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-home\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.612457 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.612419 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-tmp-dir\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.612611 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.612562 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-model-cache" (OuterVolumeSpecName: "model-cache") pod "c4303159-fb12-43df-9120-ade192c54c3e" (UID: "c4303159-fb12-43df-9120-ade192c54c3e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.614433 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.614400 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-dshm" (OuterVolumeSpecName: "dshm") pod "c4303159-fb12-43df-9120-ade192c54c3e" (UID: "c4303159-fb12-43df-9120-ade192c54c3e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.614553 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.614507 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4303159-fb12-43df-9120-ade192c54c3e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c4303159-fb12-43df-9120-ade192c54c3e" (UID: "c4303159-fb12-43df-9120-ade192c54c3e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:57:08.614828 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.614801 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4303159-fb12-43df-9120-ade192c54c3e-kube-api-access-96mw8" (OuterVolumeSpecName: "kube-api-access-96mw8") pod "c4303159-fb12-43df-9120-ade192c54c3e" (UID: "c4303159-fb12-43df-9120-ade192c54c3e"). InnerVolumeSpecName "kube-api-access-96mw8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:57:08.674472 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.674431 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c4303159-fb12-43df-9120-ade192c54c3e" (UID: "c4303159-fb12-43df-9120-ade192c54c3e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.677702 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.677649 2569 generic.go:358] "Generic (PLEG): container finished" podID="c4303159-fb12-43df-9120-ade192c54c3e" containerID="72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed" exitCode=0 Apr 24 21:57:08.677835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.677707 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" event={"ID":"c4303159-fb12-43df-9120-ade192c54c3e","Type":"ContainerDied","Data":"72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed"} Apr 24 21:57:08.677835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.677746 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" Apr 24 21:57:08.677835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.677761 2569 scope.go:117] "RemoveContainer" containerID="72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed" Apr 24 21:57:08.677973 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.677750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r" event={"ID":"c4303159-fb12-43df-9120-ade192c54c3e","Type":"ContainerDied","Data":"c2c408f17e355b6251173890eb30265aef9cbdea981447413926fca61133fc06"} Apr 24 21:57:08.687406 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.687389 2569 scope.go:117] "RemoveContainer" containerID="27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346" Apr 24 21:57:08.697301 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.697284 2569 scope.go:117] "RemoveContainer" containerID="72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed" Apr 24 21:57:08.697556 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:57:08.697537 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed\": container with ID starting with 72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed not found: ID does not exist" containerID="72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed" Apr 24 21:57:08.697614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.697565 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed"} err="failed to get container status \"72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed\": rpc error: code = NotFound desc = could not find container 
\"72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed\": container with ID starting with 72fe9c5d7e2cb99e1b08c4f2b3815de8cc7988b6faef28f17a60aba90f7879ed not found: ID does not exist" Apr 24 21:57:08.697614 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.697582 2569 scope.go:117] "RemoveContainer" containerID="27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346" Apr 24 21:57:08.697852 ip-10-0-133-73 kubenswrapper[2569]: E0424 21:57:08.697830 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346\": container with ID starting with 27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346 not found: ID does not exist" containerID="27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346" Apr 24 21:57:08.697948 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.697860 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346"} err="failed to get container status \"27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346\": rpc error: code = NotFound desc = could not find container \"27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346\": container with ID starting with 27ab75a4a4727a719a1570795353645f6b46fedab1868d211dd80f791ded7346 not found: ID does not exist" Apr 24 21:57:08.702722 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.702698 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r"] Apr 24 21:57:08.706493 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.706472 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5l55r"] Apr 24 21:57:08.712768 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.712751 2569 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96mw8\" (UniqueName: \"kubernetes.io/projected/c4303159-fb12-43df-9120-ade192c54c3e-kube-api-access-96mw8\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.712835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.712770 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-model-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.712835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.712781 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-dshm\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.712835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.712789 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4303159-fb12-43df-9120-ade192c54c3e-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.712835 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:08.712798 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4303159-fb12-43df-9120-ade192c54c3e-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 21:57:09.334628 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:09.334598 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4303159-fb12-43df-9120-ade192c54c3e" path="/var/lib/kubelet/pods/c4303159-fb12-43df-9120-ade192c54c3e/volumes" Apr 24 21:57:16.762878 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.762848 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg"] Apr 24 21:57:16.763234 ip-10-0-133-73 
kubenswrapper[2569]: I0424 21:57:16.763207 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4303159-fb12-43df-9120-ade192c54c3e" containerName="storage-initializer" Apr 24 21:57:16.763234 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.763218 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4303159-fb12-43df-9120-ade192c54c3e" containerName="storage-initializer" Apr 24 21:57:16.763306 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.763239 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4303159-fb12-43df-9120-ade192c54c3e" containerName="main" Apr 24 21:57:16.763306 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.763244 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4303159-fb12-43df-9120-ade192c54c3e" containerName="main" Apr 24 21:57:16.763306 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.763296 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4303159-fb12-43df-9120-ade192c54c3e" containerName="main" Apr 24 21:57:16.769179 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.769156 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.773176 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.773148 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 21:57:16.773291 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.773186 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-2hhcg\"" Apr 24 21:57:16.778766 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.778743 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg"] Apr 24 21:57:16.874395 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.874363 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.874395 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.874401 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j687f\" (UniqueName: \"kubernetes.io/projected/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kube-api-access-j687f\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.874737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.874428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.874737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.874548 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.874737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.874603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.874737 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.874638 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975244 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975207 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975439 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975262 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975439 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975439 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975439 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j687f\" (UniqueName: 
\"kubernetes.io/projected/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kube-api-access-j687f\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975439 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975760 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975761 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kserve-provision-location\") pod 
\"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.975816 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.975799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.978003 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.977985 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:16.985287 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:16.985263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j687f\" (UniqueName: \"kubernetes.io/projected/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kube-api-access-j687f\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:17.079240 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:17.079165 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:17.232982 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:17.232959 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg"] Apr 24 21:57:17.235438 ip-10-0-133-73 kubenswrapper[2569]: W0424 21:57:17.235406 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb226f7f7_b57b_48e5_9e0d_103b83933f9b.slice/crio-508ff95a1aafa16a9ee5c44bd2adde25a8e47a9e68d8954950972b600802e366 WatchSource:0}: Error finding container 508ff95a1aafa16a9ee5c44bd2adde25a8e47a9e68d8954950972b600802e366: Status 404 returned error can't find the container with id 508ff95a1aafa16a9ee5c44bd2adde25a8e47a9e68d8954950972b600802e366 Apr 24 21:57:17.714056 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:17.714023 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" event={"ID":"b226f7f7-b57b-48e5-9e0d-103b83933f9b","Type":"ContainerStarted","Data":"d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c"} Apr 24 21:57:17.714056 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:17.714062 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" event={"ID":"b226f7f7-b57b-48e5-9e0d-103b83933f9b","Type":"ContainerStarted","Data":"508ff95a1aafa16a9ee5c44bd2adde25a8e47a9e68d8954950972b600802e366"} Apr 24 21:57:18.719100 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:18.719063 2569 generic.go:358] "Generic (PLEG): container finished" podID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerID="d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c" exitCode=0 Apr 24 21:57:18.719482 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:18.719145 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" event={"ID":"b226f7f7-b57b-48e5-9e0d-103b83933f9b","Type":"ContainerDied","Data":"d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c"} Apr 24 21:57:19.725128 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:19.725085 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" event={"ID":"b226f7f7-b57b-48e5-9e0d-103b83933f9b","Type":"ContainerStarted","Data":"02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5"} Apr 24 21:57:19.725128 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:19.725133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" event={"ID":"b226f7f7-b57b-48e5-9e0d-103b83933f9b","Type":"ContainerStarted","Data":"86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2"} Apr 24 21:57:19.725652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:19.725250 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:19.748652 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:19.748603 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" podStartSLOduration=3.748590342 podStartE2EDuration="3.748590342s" podCreationTimestamp="2026-04-24 21:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:57:19.747312197 +0000 UTC m=+1850.987665451" watchObservedRunningTime="2026-04-24 21:57:19.748590342 +0000 UTC m=+1850.988943601" Apr 24 21:57:27.080284 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:27.080235 2569 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:27.080868 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:27.080391 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:27.083213 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:27.083187 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:27.756601 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:27.756569 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 21:57:49.764767 ip-10-0-133-73 kubenswrapper[2569]: I0424 21:57:49.764725 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 22:01:05.667020 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:05.666814 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg"] Apr 24 22:01:05.668089 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:05.668051 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="main" containerID="cri-o://86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2" gracePeriod=30 Apr 24 22:01:05.668369 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:05.668117 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" 
podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="tokenizer" containerID="cri-o://02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5" gracePeriod=30 Apr 24 22:01:06.619213 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:06.619111 2569 generic.go:358] "Generic (PLEG): container finished" podID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerID="86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2" exitCode=0 Apr 24 22:01:06.619213 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:06.619152 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" event={"ID":"b226f7f7-b57b-48e5-9e0d-103b83933f9b","Type":"ContainerDied","Data":"86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2"} Apr 24 22:01:06.924080 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:06.924056 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 22:01:07.019169 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019136 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-cache\") pod \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " Apr 24 22:01:07.019369 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019184 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-uds\") pod \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " Apr 24 22:01:07.019369 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019231 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j687f\" 
(UniqueName: \"kubernetes.io/projected/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kube-api-access-j687f\") pod \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " Apr 24 22:01:07.019369 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019304 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kserve-provision-location\") pod \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " Apr 24 22:01:07.019369 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019355 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tls-certs\") pod \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " Apr 24 22:01:07.019580 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019398 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-tmp\") pod \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\" (UID: \"b226f7f7-b57b-48e5-9e0d-103b83933f9b\") " Apr 24 22:01:07.019580 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019436 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b226f7f7-b57b-48e5-9e0d-103b83933f9b" (UID: "b226f7f7-b57b-48e5-9e0d-103b83933f9b"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:07.019580 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019482 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b226f7f7-b57b-48e5-9e0d-103b83933f9b" (UID: "b226f7f7-b57b-48e5-9e0d-103b83933f9b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:07.019786 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019707 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:01:07.019786 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019725 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:01:07.019867 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.019789 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b226f7f7-b57b-48e5-9e0d-103b83933f9b" (UID: "b226f7f7-b57b-48e5-9e0d-103b83933f9b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:07.020139 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.020112 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b226f7f7-b57b-48e5-9e0d-103b83933f9b" (UID: "b226f7f7-b57b-48e5-9e0d-103b83933f9b"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:07.021542 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.021522 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kube-api-access-j687f" (OuterVolumeSpecName: "kube-api-access-j687f") pod "b226f7f7-b57b-48e5-9e0d-103b83933f9b" (UID: "b226f7f7-b57b-48e5-9e0d-103b83933f9b"). InnerVolumeSpecName "kube-api-access-j687f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:01:07.021636 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.021614 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b226f7f7-b57b-48e5-9e0d-103b83933f9b" (UID: "b226f7f7-b57b-48e5-9e0d-103b83933f9b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:01:07.120773 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.120747 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j687f\" (UniqueName: \"kubernetes.io/projected/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kube-api-access-j687f\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:01:07.120773 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.120771 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:01:07.120939 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.120781 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 
22:01:07.120939 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.120793 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b226f7f7-b57b-48e5-9e0d-103b83933f9b-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:01:07.625780 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.625697 2569 generic.go:358] "Generic (PLEG): container finished" podID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerID="02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5" exitCode=0 Apr 24 22:01:07.625780 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.625758 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" event={"ID":"b226f7f7-b57b-48e5-9e0d-103b83933f9b","Type":"ContainerDied","Data":"02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5"} Apr 24 22:01:07.625999 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.625799 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" event={"ID":"b226f7f7-b57b-48e5-9e0d-103b83933f9b","Type":"ContainerDied","Data":"508ff95a1aafa16a9ee5c44bd2adde25a8e47a9e68d8954950972b600802e366"} Apr 24 22:01:07.625999 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.625818 2569 scope.go:117] "RemoveContainer" containerID="02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5" Apr 24 22:01:07.625999 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.625764 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg" Apr 24 22:01:07.635212 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.635190 2569 scope.go:117] "RemoveContainer" containerID="86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2" Apr 24 22:01:07.643816 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.643628 2569 scope.go:117] "RemoveContainer" containerID="d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c" Apr 24 22:01:07.646947 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.646923 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg"] Apr 24 22:01:07.651245 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.651221 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-k8wzg"] Apr 24 22:01:07.653004 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.652978 2569 scope.go:117] "RemoveContainer" containerID="02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5" Apr 24 22:01:07.653331 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:01:07.653309 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5\": container with ID starting with 02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5 not found: ID does not exist" containerID="02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5" Apr 24 22:01:07.653387 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.653342 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5"} err="failed to get container status \"02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5\": rpc error: code = 
NotFound desc = could not find container \"02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5\": container with ID starting with 02182b255baf716be8050688d5855526b19f7e5e28176846bbcc64ee62f6fef5 not found: ID does not exist" Apr 24 22:01:07.653387 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.653363 2569 scope.go:117] "RemoveContainer" containerID="86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2" Apr 24 22:01:07.653623 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:01:07.653607 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2\": container with ID starting with 86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2 not found: ID does not exist" containerID="86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2" Apr 24 22:01:07.653698 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.653632 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2"} err="failed to get container status \"86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2\": rpc error: code = NotFound desc = could not find container \"86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2\": container with ID starting with 86cb0d7b17cd93eaa1c7da66bbb5f7736fb0fc24049fb7ca8fbbe9f0ff2c08e2 not found: ID does not exist" Apr 24 22:01:07.653698 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.653654 2569 scope.go:117] "RemoveContainer" containerID="d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c" Apr 24 22:01:07.653941 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:01:07.653922 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c\": container with ID starting with d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c not found: ID does not exist" containerID="d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c" Apr 24 22:01:07.653982 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:07.653948 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c"} err="failed to get container status \"d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c\": rpc error: code = NotFound desc = could not find container \"d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c\": container with ID starting with d17f9bde1fb105f2d6d01618b1cfb0da6916a6b16bd108d5734c57451f1b416c not found: ID does not exist" Apr 24 22:01:09.332583 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:09.332553 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" path="/var/lib/kubelet/pods/b226f7f7-b57b-48e5-9e0d-103b83933f9b/volumes" Apr 24 22:01:13.843827 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.843791 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"] Apr 24 22:01:13.844190 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.844149 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="tokenizer" Apr 24 22:01:13.844190 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.844159 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="tokenizer" Apr 24 22:01:13.844190 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.844172 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="storage-initializer" Apr 24 22:01:13.844190 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.844177 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="storage-initializer" Apr 24 22:01:13.844190 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.844187 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="main" Apr 24 22:01:13.844190 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.844193 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="main" Apr 24 22:01:13.844400 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.844254 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="tokenizer" Apr 24 22:01:13.844400 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.844261 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b226f7f7-b57b-48e5-9e0d-103b83933f9b" containerName="main" Apr 24 22:01:13.847683 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.847656 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:01:13.850622 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.850600 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-6br6v\"" Apr 24 22:01:13.850922 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.850901 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 22:01:13.858554 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.858531 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"] Apr 24 22:01:13.977083 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.977043 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:01:13.977267 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.977102 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:01:13.977267 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.977144 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:01:13.977267 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.977170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec3f7c3-b605-4657-a669-ee773919abd2-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:01:13.977267 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.977225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnnr\" (UniqueName: \"kubernetes.io/projected/6ec3f7c3-b605-4657-a669-ee773919abd2-kube-api-access-llnnr\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:01:13.977440 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:13.977293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:01:14.078634 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.078593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.078634 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.078635 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec3f7c3-b605-4657-a669-ee773919abd2-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.078888 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.078654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llnnr\" (UniqueName: \"kubernetes.io/projected/6ec3f7c3-b605-4657-a669-ee773919abd2-kube-api-access-llnnr\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.078888 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.078708 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.078888 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.078802 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.078888 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.078853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.079084 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.079035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.079144 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.079126 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.079201 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.079148 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.079259 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.079234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.081319 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.081302 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec3f7c3-b605-4657-a669-ee773919abd2-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.088653 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.088616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnnr\" (UniqueName: \"kubernetes.io/projected/6ec3f7c3-b605-4657-a669-ee773919abd2-kube-api-access-llnnr\") pod \"stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.158571 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.158479 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:14.500604 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.500571 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"]
Apr 24 22:01:14.501935 ip-10-0-133-73 kubenswrapper[2569]: W0424 22:01:14.501903 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec3f7c3_b605_4657_a669_ee773919abd2.slice/crio-3c2cac2628f8f6f84bbe8a4557b54562ef861e112ba8b8c7439ca14a0f650e0d WatchSource:0}: Error finding container 3c2cac2628f8f6f84bbe8a4557b54562ef861e112ba8b8c7439ca14a0f650e0d: Status 404 returned error can't find the container with id 3c2cac2628f8f6f84bbe8a4557b54562ef861e112ba8b8c7439ca14a0f650e0d
Apr 24 22:01:14.504426 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.504405 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:01:14.655086 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.655049 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" event={"ID":"6ec3f7c3-b605-4657-a669-ee773919abd2","Type":"ContainerStarted","Data":"d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f"}
Apr 24 22:01:14.655086 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:14.655086 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" event={"ID":"6ec3f7c3-b605-4657-a669-ee773919abd2","Type":"ContainerStarted","Data":"3c2cac2628f8f6f84bbe8a4557b54562ef861e112ba8b8c7439ca14a0f650e0d"}
Apr 24 22:01:15.660239 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:15.660194 2569 generic.go:358] "Generic (PLEG): container finished" podID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerID="d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f" exitCode=0
Apr 24 22:01:15.660662 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:15.660283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" event={"ID":"6ec3f7c3-b605-4657-a669-ee773919abd2","Type":"ContainerDied","Data":"d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f"}
Apr 24 22:01:16.666613 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:16.666573 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" event={"ID":"6ec3f7c3-b605-4657-a669-ee773919abd2","Type":"ContainerStarted","Data":"3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89"}
Apr 24 22:01:16.666613 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:16.666618 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" event={"ID":"6ec3f7c3-b605-4657-a669-ee773919abd2","Type":"ContainerStarted","Data":"4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c"}
Apr 24 22:01:16.667080 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:16.666712 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:16.691795 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:16.691735 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" podStartSLOduration=3.691717859 podStartE2EDuration="3.691717859s" podCreationTimestamp="2026-04-24 22:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:01:16.68903676 +0000 UTC m=+2087.929390022" watchObservedRunningTime="2026-04-24 22:01:16.691717859 +0000 UTC m=+2087.932071119"
Apr 24 22:01:24.158639 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:24.158586 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:24.159067 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:24.158801 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:24.161417 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:24.161393 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:24.703317 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:24.703288 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:01:29.470185 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:29.470153 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log"
Apr 24 22:01:29.476356 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:29.476328 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log"
Apr 24 22:01:46.711177 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:01:46.711143 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"
Apr 24 22:04:45.088330 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:45.088298 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"]
Apr 24 22:04:45.088872 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:45.088614 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="main" containerID="cri-o://150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a" gracePeriod=30
Apr 24 22:04:45.088872 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:45.088730 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="tokenizer" containerID="cri-o://ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b" gracePeriod=30
Apr 24 22:04:45.497330 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:45.497223 2569 generic.go:358] "Generic (PLEG): container finished" podID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerID="150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a" exitCode=0
Apr 24 22:04:45.497330 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:45.497300 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" event={"ID":"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9","Type":"ContainerDied","Data":"150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a"}
Apr 24 22:04:46.455943 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.455920 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 22:04:46.504323 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.504288 2569 generic.go:358] "Generic (PLEG): container finished" podID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerID="ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b" exitCode=0
Apr 24 22:04:46.504522 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.504364 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"
Apr 24 22:04:46.504522 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.504413 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" event={"ID":"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9","Type":"ContainerDied","Data":"ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b"}
Apr 24 22:04:46.504522 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.504454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25" event={"ID":"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9","Type":"ContainerDied","Data":"6f393a93423d7d08107775f6163cf9bac16418516a2f660c5121d47971bdd418"}
Apr 24 22:04:46.504522 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.504474 2569 scope.go:117] "RemoveContainer" containerID="ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b"
Apr 24 22:04:46.516339 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.516153 2569 scope.go:117] "RemoveContainer" containerID="150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a"
Apr 24 22:04:46.525733 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.525708 2569 scope.go:117] "RemoveContainer" containerID="b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326"
Apr 24 22:04:46.527121 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.526335 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-uds\") pod \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") "
Apr 24 22:04:46.527121 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.526398 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-tmp\") pod \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") "
Apr 24 22:04:46.527121 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.526437 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kserve-provision-location\") pod \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") "
Apr 24 22:04:46.527121 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.526506 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqprn\" (UniqueName: \"kubernetes.io/projected/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kube-api-access-hqprn\") pod \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") "
Apr 24 22:04:46.527121 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.526558 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-cache\") pod \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") "
Apr 24 22:04:46.527121 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.526608 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tls-certs\") pod \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\" (UID: \"3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9\") "
Apr 24 22:04:46.527121 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.526622 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" (UID: "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:04:46.527121 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.526946 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:04:46.527588 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.527448 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" (UID: "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:04:46.527752 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.527649 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" (UID: "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:04:46.527988 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.527961 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" (UID: "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:04:46.529459 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.529435 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" (UID: "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:04:46.529653 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.529627 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kube-api-access-hqprn" (OuterVolumeSpecName: "kube-api-access-hqprn") pod "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" (UID: "3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9"). InnerVolumeSpecName "kube-api-access-hqprn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:04:46.548466 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.548443 2569 scope.go:117] "RemoveContainer" containerID="ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b"
Apr 24 22:04:46.548768 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:04:46.548743 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b\": container with ID starting with ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b not found: ID does not exist" containerID="ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b"
Apr 24 22:04:46.548848 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.548780 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b"} err="failed to get container status \"ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b\": rpc error: code = NotFound desc = could not find container \"ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b\": container with ID starting with ce50e42fef4235a746b655d9cb3495a135e7a688fe0e73e0da75f984028ad41b not found: ID does not exist"
Apr 24 22:04:46.548848 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.548803 2569 scope.go:117] "RemoveContainer" containerID="150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a"
Apr 24 22:04:46.549084 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:04:46.549062 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a\": container with ID starting with 150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a not found: ID does not exist" containerID="150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a"
Apr 24 22:04:46.549140 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.549101 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a"} err="failed to get container status \"150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a\": rpc error: code = NotFound desc = could not find container \"150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a\": container with ID starting with 150f8cea1b97cec98087a65cd08d1611f880ef1014bc2ee7838db4ca46003d5a not found: ID does not exist"
Apr 24 22:04:46.549140 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.549120 2569 scope.go:117] "RemoveContainer" containerID="b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326"
Apr 24 22:04:46.549354 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:04:46.549336 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326\": container with ID starting with b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326 not found: ID does not exist" containerID="b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326"
Apr 24 22:04:46.549411 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.549358 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326"} err="failed to get container status \"b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326\": rpc error: code = NotFound desc = could not find container \"b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326\": container with ID starting with b9cfa418a6bec9e791afe0226aae7419a6d04175cf91b5d3f5d7237f62483326 not found: ID does not exist"
Apr 24 22:04:46.627780 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.627737 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:04:46.627780 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.627768 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:04:46.627780 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.627778 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hqprn\" (UniqueName: \"kubernetes.io/projected/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-kube-api-access-hqprn\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:04:46.627780 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.627788 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:04:46.628135 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.627798 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:04:46.832189 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.832153 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"]
Apr 24 22:04:46.835574 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:46.835546 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-64cc9hbs25"]
Apr 24 22:04:47.333809 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:04:47.333777 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" path="/var/lib/kubelet/pods/3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9/volumes"
Apr 24 22:05:00.975969 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.975931 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"]
Apr 24 22:05:00.976354 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.976302 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="main"
Apr 24 22:05:00.976354 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.976313 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="main"
Apr 24 22:05:00.976354 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.976320 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="tokenizer"
Apr 24 22:05:00.976354 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.976325 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="tokenizer"
Apr 24 22:05:00.976354 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.976340 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="storage-initializer"
Apr 24 22:05:00.976354 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.976346 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="storage-initializer"
Apr 24 22:05:00.976556 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.976415 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="main"
Apr 24 22:05:00.976556 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.976427 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c2f79c5-ca8e-4133-bcd3-44e48b3d98f9" containerName="tokenizer"
Apr 24 22:05:00.981392 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.981371 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:00.984317 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.984286 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-lmxhb\""
Apr 24 22:05:00.984472 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.984355 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 24 22:05:00.995371 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:00.995344 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"]
Apr 24 22:05:01.055778 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.055730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43d89866-e448-4bef-acd3-652f5dec57e2-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.055975 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.055790 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8szb\" (UniqueName: \"kubernetes.io/projected/43d89866-e448-4bef-acd3-652f5dec57e2-kube-api-access-p8szb\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.055975 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.055837 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.055975 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.055891 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.056136 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.055979 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.056136 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.056032 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.156658 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.156616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.156861 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.156707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.156861 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.156763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.156861 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.156819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.156861 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.156851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43d89866-e448-4bef-acd3-652f5dec57e2-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.157084 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.156898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8szb\" (UniqueName: \"kubernetes.io/projected/43d89866-e448-4bef-acd3-652f5dec57e2-kube-api-access-p8szb\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.157084 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.157074 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.157195 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.157087 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.157195 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.157182 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.157984 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.157720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.166407 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.165301 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43d89866-e448-4bef-acd3-652f5dec57e2-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.167633 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.167605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8szb\" (UniqueName: \"kubernetes.io/projected/43d89866-e448-4bef-acd3-652f5dec57e2-kube-api-access-p8szb\") pod \"router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.290623 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.290579 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:05:01.427744 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.427712 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"]
Apr 24 22:05:01.429848 ip-10-0-133-73 kubenswrapper[2569]: W0424 22:05:01.429817 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43d89866_e448_4bef_acd3_652f5dec57e2.slice/crio-47392d0b44740970f2d162a425c8d4c971768d41e92a5d546639047ef7b9949a WatchSource:0}: Error finding container 47392d0b44740970f2d162a425c8d4c971768d41e92a5d546639047ef7b9949a: Status 404 returned error can't find the container with id 47392d0b44740970f2d162a425c8d4c971768d41e92a5d546639047ef7b9949a
Apr 24 22:05:01.574305 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.574217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" event={"ID":"43d89866-e448-4bef-acd3-652f5dec57e2","Type":"ContainerStarted","Data":"22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5"}
Apr 24 22:05:01.574305 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:01.574255 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
event={"ID":"43d89866-e448-4bef-acd3-652f5dec57e2","Type":"ContainerStarted","Data":"47392d0b44740970f2d162a425c8d4c971768d41e92a5d546639047ef7b9949a"} Apr 24 22:05:02.579613 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:02.579509 2569 generic.go:358] "Generic (PLEG): container finished" podID="43d89866-e448-4bef-acd3-652f5dec57e2" containerID="22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5" exitCode=0 Apr 24 22:05:02.579613 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:02.579559 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" event={"ID":"43d89866-e448-4bef-acd3-652f5dec57e2","Type":"ContainerDied","Data":"22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5"} Apr 24 22:05:03.588421 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:03.588382 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" event={"ID":"43d89866-e448-4bef-acd3-652f5dec57e2","Type":"ContainerStarted","Data":"8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780"} Apr 24 22:05:03.588421 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:03.588426 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" event={"ID":"43d89866-e448-4bef-acd3-652f5dec57e2","Type":"ContainerStarted","Data":"5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331"} Apr 24 22:05:03.589010 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:03.588549 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" Apr 24 22:05:03.612270 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:03.612217 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" podStartSLOduration=3.612202482 podStartE2EDuration="3.612202482s" podCreationTimestamp="2026-04-24 22:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:05:03.609301797 +0000 UTC m=+2314.849655060" watchObservedRunningTime="2026-04-24 22:05:03.612202482 +0000 UTC m=+2314.852555742" Apr 24 22:05:11.291018 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:11.290975 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" Apr 24 22:05:11.291018 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:11.291023 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" Apr 24 22:05:11.293844 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:11.293820 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" Apr 24 22:05:11.622282 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:11.622188 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" Apr 24 22:05:20.715063 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:20.714973 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"] Apr 24 22:05:20.715610 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:20.715296 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="main" 
containerID="cri-o://4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c" gracePeriod=30 Apr 24 22:05:20.715610 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:20.715337 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="tokenizer" containerID="cri-o://3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89" gracePeriod=30 Apr 24 22:05:21.663682 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:21.663643 2569 generic.go:358] "Generic (PLEG): container finished" podID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerID="4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c" exitCode=0 Apr 24 22:05:21.663864 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:21.663704 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" event={"ID":"6ec3f7c3-b605-4657-a669-ee773919abd2","Type":"ContainerDied","Data":"4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c"} Apr 24 22:05:22.089282 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.089252 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:05:22.138312 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138274 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec3f7c3-b605-4657-a669-ee773919abd2-tls-certs\") pod \"6ec3f7c3-b605-4657-a669-ee773919abd2\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " Apr 24 22:05:22.138512 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138366 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-cache\") pod \"6ec3f7c3-b605-4657-a669-ee773919abd2\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " Apr 24 22:05:22.138512 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138398 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llnnr\" (UniqueName: \"kubernetes.io/projected/6ec3f7c3-b605-4657-a669-ee773919abd2-kube-api-access-llnnr\") pod \"6ec3f7c3-b605-4657-a669-ee773919abd2\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " Apr 24 22:05:22.138512 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138436 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-tmp\") pod \"6ec3f7c3-b605-4657-a669-ee773919abd2\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " Apr 24 22:05:22.138512 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138471 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-uds\") pod \"6ec3f7c3-b605-4657-a669-ee773919abd2\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " Apr 24 22:05:22.138512 
ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138498 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-kserve-provision-location\") pod \"6ec3f7c3-b605-4657-a669-ee773919abd2\" (UID: \"6ec3f7c3-b605-4657-a669-ee773919abd2\") " Apr 24 22:05:22.138804 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138735 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6ec3f7c3-b605-4657-a669-ee773919abd2" (UID: "6ec3f7c3-b605-4657-a669-ee773919abd2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:22.138844 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138826 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6ec3f7c3-b605-4657-a669-ee773919abd2" (UID: "6ec3f7c3-b605-4657-a669-ee773919abd2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:22.138884 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.138839 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6ec3f7c3-b605-4657-a669-ee773919abd2" (UID: "6ec3f7c3-b605-4657-a669-ee773919abd2"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:22.139754 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.139719 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6ec3f7c3-b605-4657-a669-ee773919abd2" (UID: "6ec3f7c3-b605-4657-a669-ee773919abd2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:22.140902 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.140879 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec3f7c3-b605-4657-a669-ee773919abd2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6ec3f7c3-b605-4657-a669-ee773919abd2" (UID: "6ec3f7c3-b605-4657-a669-ee773919abd2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:05:22.140902 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.140891 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec3f7c3-b605-4657-a669-ee773919abd2-kube-api-access-llnnr" (OuterVolumeSpecName: "kube-api-access-llnnr") pod "6ec3f7c3-b605-4657-a669-ee773919abd2" (UID: "6ec3f7c3-b605-4657-a669-ee773919abd2"). InnerVolumeSpecName "kube-api-access-llnnr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:05:22.239939 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.239847 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6ec3f7c3-b605-4657-a669-ee773919abd2-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:05:22.239939 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.239882 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:05:22.239939 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.239892 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llnnr\" (UniqueName: \"kubernetes.io/projected/6ec3f7c3-b605-4657-a669-ee773919abd2-kube-api-access-llnnr\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:05:22.239939 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.239901 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:05:22.239939 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.239910 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:05:22.239939 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.239921 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ec3f7c3-b605-4657-a669-ee773919abd2-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:05:22.669464 ip-10-0-133-73 kubenswrapper[2569]: 
I0424 22:05:22.669422 2569 generic.go:358] "Generic (PLEG): container finished" podID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerID="3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89" exitCode=0 Apr 24 22:05:22.669640 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.669484 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" event={"ID":"6ec3f7c3-b605-4657-a669-ee773919abd2","Type":"ContainerDied","Data":"3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89"} Apr 24 22:05:22.669640 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.669501 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" Apr 24 22:05:22.669640 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.669518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq" event={"ID":"6ec3f7c3-b605-4657-a669-ee773919abd2","Type":"ContainerDied","Data":"3c2cac2628f8f6f84bbe8a4557b54562ef861e112ba8b8c7439ca14a0f650e0d"} Apr 24 22:05:22.669640 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.669539 2569 scope.go:117] "RemoveContainer" containerID="3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89" Apr 24 22:05:22.679300 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.679278 2569 scope.go:117] "RemoveContainer" containerID="4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c" Apr 24 22:05:22.687508 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.687490 2569 scope.go:117] "RemoveContainer" containerID="d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f" Apr 24 22:05:22.694218 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.694191 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"] Apr 24 22:05:22.696851 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.696831 2569 scope.go:117] "RemoveContainer" containerID="3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89" Apr 24 22:05:22.697162 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:05:22.697143 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89\": container with ID starting with 3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89 not found: ID does not exist" containerID="3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89" Apr 24 22:05:22.697231 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.697173 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89"} err="failed to get container status \"3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89\": rpc error: code = NotFound desc = could not find container \"3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89\": container with ID starting with 3153555b7f138caea0edcc6c82c2c9f1f7114b40efe7545d107d6ca883b52f89 not found: ID does not exist" Apr 24 22:05:22.697231 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.697207 2569 scope.go:117] "RemoveContainer" containerID="4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c" Apr 24 22:05:22.697500 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:05:22.697476 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c\": container with ID starting with 4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c not found: ID does not exist" 
containerID="4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c" Apr 24 22:05:22.697541 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.697511 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c"} err="failed to get container status \"4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c\": rpc error: code = NotFound desc = could not find container \"4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c\": container with ID starting with 4204e5b61978f6b4139b1cadcc7dbe51652a0b442c26f37c696f43e53fae352c not found: ID does not exist" Apr 24 22:05:22.697541 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.697536 2569 scope.go:117] "RemoveContainer" containerID="d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f" Apr 24 22:05:22.697786 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:05:22.697761 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f\": container with ID starting with d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f not found: ID does not exist" containerID="d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f" Apr 24 22:05:22.697914 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.697789 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f"} err="failed to get container status \"d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f\": rpc error: code = NotFound desc = could not find container \"d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f\": container with ID starting with d699efc28d59cbe3af57117d3ae4bfd878c427e150c85ec9ccec53b1c63dd54f not found: ID does not exist" Apr 24 
22:05:22.697914 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:22.697768 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6c8b9d6d94-mzbxq"] Apr 24 22:05:23.333159 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:23.333123 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" path="/var/lib/kubelet/pods/6ec3f7c3-b605-4657-a669-ee773919abd2/volumes" Apr 24 22:05:32.627020 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:32.626987 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" Apr 24 22:05:58.622854 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:58.622807 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-85ccfc4685-l5fj5"] Apr 24 22:05:58.623915 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:58.623085 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" podUID="27f39f5f-b64b-482b-b684-9c573028ee21" containerName="manager" containerID="cri-o://4bd950b83cb68c3dd99ce0b96f987e1aeac9d90b7aeb4d24b5d6aa9302eae8b1" gracePeriod=30 Apr 24 22:05:58.830306 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:58.830269 2569 generic.go:358] "Generic (PLEG): container finished" podID="27f39f5f-b64b-482b-b684-9c573028ee21" containerID="4bd950b83cb68c3dd99ce0b96f987e1aeac9d90b7aeb4d24b5d6aa9302eae8b1" exitCode=0 Apr 24 22:05:58.830513 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:58.830350 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" event={"ID":"27f39f5f-b64b-482b-b684-9c573028ee21","Type":"ContainerDied","Data":"4bd950b83cb68c3dd99ce0b96f987e1aeac9d90b7aeb4d24b5d6aa9302eae8b1"} Apr 24 22:05:58.884809 ip-10-0-133-73 kubenswrapper[2569]: I0424 
22:05:58.884736 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 22:05:59.057339 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.057296 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27f39f5f-b64b-482b-b684-9c573028ee21-cert\") pod \"27f39f5f-b64b-482b-b684-9c573028ee21\" (UID: \"27f39f5f-b64b-482b-b684-9c573028ee21\") " Apr 24 22:05:59.057533 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.057416 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7vmb\" (UniqueName: \"kubernetes.io/projected/27f39f5f-b64b-482b-b684-9c573028ee21-kube-api-access-h7vmb\") pod \"27f39f5f-b64b-482b-b684-9c573028ee21\" (UID: \"27f39f5f-b64b-482b-b684-9c573028ee21\") " Apr 24 22:05:59.059581 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.059544 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f39f5f-b64b-482b-b684-9c573028ee21-cert" (OuterVolumeSpecName: "cert") pod "27f39f5f-b64b-482b-b684-9c573028ee21" (UID: "27f39f5f-b64b-482b-b684-9c573028ee21"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:05:59.059729 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.059654 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f39f5f-b64b-482b-b684-9c573028ee21-kube-api-access-h7vmb" (OuterVolumeSpecName: "kube-api-access-h7vmb") pod "27f39f5f-b64b-482b-b684-9c573028ee21" (UID: "27f39f5f-b64b-482b-b684-9c573028ee21"). InnerVolumeSpecName "kube-api-access-h7vmb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:05:59.158365 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.158265 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7vmb\" (UniqueName: \"kubernetes.io/projected/27f39f5f-b64b-482b-b684-9c573028ee21-kube-api-access-h7vmb\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:05:59.158365 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.158302 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27f39f5f-b64b-482b-b684-9c573028ee21-cert\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:05:59.835317 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.835277 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" event={"ID":"27f39f5f-b64b-482b-b684-9c573028ee21","Type":"ContainerDied","Data":"5893b4bbbee9c8cdb88bd2a57d371df71d079a17daf490bac92c114a334e0c12"} Apr 24 22:05:59.835805 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.835331 2569 scope.go:117] "RemoveContainer" containerID="4bd950b83cb68c3dd99ce0b96f987e1aeac9d90b7aeb4d24b5d6aa9302eae8b1" Apr 24 22:05:59.835805 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.835335 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-85ccfc4685-l5fj5" Apr 24 22:05:59.854331 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.854292 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-85ccfc4685-l5fj5"] Apr 24 22:05:59.859401 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:05:59.859363 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-85ccfc4685-l5fj5"] Apr 24 22:06:01.333256 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:01.333222 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f39f5f-b64b-482b-b684-9c573028ee21" path="/var/lib/kubelet/pods/27f39f5f-b64b-482b-b684-9c573028ee21/volumes" Apr 24 22:06:29.501743 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:29.501694 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 22:06:29.509003 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:29.508972 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 22:06:34.384806 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.384766 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 22:06:34.385574 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385548 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="tokenizer" Apr 24 22:06:34.385574 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385573 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="tokenizer" Apr 24 22:06:34.385792 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385592 2569 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="27f39f5f-b64b-482b-b684-9c573028ee21" containerName="manager"
Apr 24 22:06:34.385792 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385600 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f39f5f-b64b-482b-b684-9c573028ee21" containerName="manager"
Apr 24 22:06:34.385792 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385650 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="storage-initializer"
Apr 24 22:06:34.385792 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385684 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="storage-initializer"
Apr 24 22:06:34.385792 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385709 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="main"
Apr 24 22:06:34.385792 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385717 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="main"
Apr 24 22:06:34.386091 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385825 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="main"
Apr 24 22:06:34.386091 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385842 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ec3f7c3-b605-4657-a669-ee773919abd2" containerName="tokenizer"
Apr 24 22:06:34.386091 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.385853 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="27f39f5f-b64b-482b-b684-9c573028ee21" containerName="manager"
Apr 24 22:06:34.388896 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.388874 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.391923 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.391900 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 24 22:06:34.393224 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.393202 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-gnhtx\""
Apr 24 22:06:34.400932 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.400907 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 24 22:06:34.458156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.457716 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.458156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.457762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.458156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.457790 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.458156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.457835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.458156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.457879 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.458156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.457907 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4cfp\" (UniqueName: \"kubernetes.io/projected/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kube-api-access-m4cfp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.458156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.457956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.460855 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.460828 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"]
Apr 24 22:06:34.464218 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.464195 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.467114 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.467009 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-t2krb\""
Apr 24 22:06:34.479211 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.479184 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"]
Apr 24 22:06:34.559294 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.559294 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559294 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.559548 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.559548 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559459 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.559548 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559504 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.559548 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.559818 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559574 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.559818 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559639 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.559818 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559661 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.559818 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.559818 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.559818 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cfp\" (UniqueName: \"kubernetes.io/projected/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kube-api-access-m4cfp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.560097 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559832 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.560097 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76kft\" (UniqueName: \"kubernetes.io/projected/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kube-api-access-76kft\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.560097 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.560097 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.559909 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.560332 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.560308 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.561923 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.561907 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.561979 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.561947 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.573860 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.573833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cfp\" (UniqueName: \"kubernetes.io/projected/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kube-api-access-m4cfp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.660484 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.660394 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.660484 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.660432 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.660484 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.660460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.660805 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.660490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.660805 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.660519 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.660805 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.660558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76kft\" (UniqueName: \"kubernetes.io/projected/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kube-api-access-76kft\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.660940 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.660914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.660992 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.660940 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.661168 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.661144 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.661238 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.661179 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.663433 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.663401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.677982 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.677953 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76kft\" (UniqueName: \"kubernetes.io/projected/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kube-api-access-76kft\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.699980 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.699944 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 24 22:06:34.776519 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.776476 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:34.845170 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.845122 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 24 22:06:34.846301 ip-10-0-133-73 kubenswrapper[2569]: W0424 22:06:34.846258 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac6219c_2d7c_45b0_b363_8d1b6bb3cdab.slice/crio-4a6f8d64ff9a9e12fd71f642ff2550213ffafdeeda967e7d87510e2fc037bcfe WatchSource:0}: Error finding container 4a6f8d64ff9a9e12fd71f642ff2550213ffafdeeda967e7d87510e2fc037bcfe: Status 404 returned error can't find the container with id 4a6f8d64ff9a9e12fd71f642ff2550213ffafdeeda967e7d87510e2fc037bcfe
Apr 24 22:06:34.849578 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.849555 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:06:34.946385 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.946356 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"]
Apr 24 22:06:34.948279 ip-10-0-133-73 kubenswrapper[2569]: W0424 22:06:34.948247 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc77b9224_ba6d_4e2e_80ab_8048ce7054e6.slice/crio-859e43cb0086de622b3cb51648aa2b4570860a732caa2cda86c09828fe9aa5ed WatchSource:0}: Error finding container 859e43cb0086de622b3cb51648aa2b4570860a732caa2cda86c09828fe9aa5ed: Status 404 returned error can't find the container with id 859e43cb0086de622b3cb51648aa2b4570860a732caa2cda86c09828fe9aa5ed
Apr 24 22:06:34.977598 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.977560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" event={"ID":"c77b9224-ba6d-4e2e-80ab-8048ce7054e6","Type":"ContainerStarted","Data":"859e43cb0086de622b3cb51648aa2b4570860a732caa2cda86c09828fe9aa5ed"}
Apr 24 22:06:34.979039 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.979009 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab","Type":"ContainerStarted","Data":"d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872"}
Apr 24 22:06:34.979153 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:34.979048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab","Type":"ContainerStarted","Data":"4a6f8d64ff9a9e12fd71f642ff2550213ffafdeeda967e7d87510e2fc037bcfe"}
Apr 24 22:06:35.985558 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:35.985524 2569 generic.go:358] "Generic (PLEG): container finished" podID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerID="e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb" exitCode=0
Apr 24 22:06:35.986000 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:35.985605 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" event={"ID":"c77b9224-ba6d-4e2e-80ab-8048ce7054e6","Type":"ContainerDied","Data":"e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb"}
Apr 24 22:06:36.991073 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:36.991033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" event={"ID":"c77b9224-ba6d-4e2e-80ab-8048ce7054e6","Type":"ContainerStarted","Data":"2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37"}
Apr 24 22:06:36.991073 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:36.991075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" event={"ID":"c77b9224-ba6d-4e2e-80ab-8048ce7054e6","Type":"ContainerStarted","Data":"771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a"}
Apr 24 22:06:36.991602 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:36.991144 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:37.029799 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:37.029736 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" podStartSLOduration=3.029714256 podStartE2EDuration="3.029714256s" podCreationTimestamp="2026-04-24 22:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:06:37.024647886 +0000 UTC m=+2408.265001188" watchObservedRunningTime="2026-04-24 22:06:37.029714256 +0000 UTC m=+2408.270067518"
Apr 24 22:06:44.776751 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:44.776709 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:44.777278 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:44.776763 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:44.779581 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:44.779551 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:06:45.035195 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:06:45.035113 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:07:06.040253 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:06.040204 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"
Apr 24 22:07:42.843573 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:42.843533 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"]
Apr 24 22:07:42.844179 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:42.843954 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="main" containerID="cri-o://5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331" gracePeriod=30
Apr 24 22:07:42.844179 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:42.844023 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="tokenizer" containerID="cri-o://8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780" gracePeriod=30
Apr 24 22:07:43.274767 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:43.274735 2569 generic.go:358] "Generic (PLEG): container finished" podID="43d89866-e448-4bef-acd3-652f5dec57e2" containerID="5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331" exitCode=0
Apr 24 22:07:43.274940 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:43.274787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" event={"ID":"43d89866-e448-4bef-acd3-652f5dec57e2","Type":"ContainerDied","Data":"5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331"}
Apr 24 22:07:44.091690 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.091650 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:07:44.181813 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.181735 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-kserve-provision-location\") pod \"43d89866-e448-4bef-acd3-652f5dec57e2\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") "
Apr 24 22:07:44.181813 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.181784 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-cache\") pod \"43d89866-e448-4bef-acd3-652f5dec57e2\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") "
Apr 24 22:07:44.182039 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.181823 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-tmp\") pod \"43d89866-e448-4bef-acd3-652f5dec57e2\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") "
Apr 24 22:07:44.182039 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.181845 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-uds\") pod \"43d89866-e448-4bef-acd3-652f5dec57e2\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") "
Apr 24 22:07:44.182039 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.181885 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8szb\" (UniqueName: \"kubernetes.io/projected/43d89866-e448-4bef-acd3-652f5dec57e2-kube-api-access-p8szb\") pod \"43d89866-e448-4bef-acd3-652f5dec57e2\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") "
Apr 24 22:07:44.182039 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.181942 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43d89866-e448-4bef-acd3-652f5dec57e2-tls-certs\") pod \"43d89866-e448-4bef-acd3-652f5dec57e2\" (UID: \"43d89866-e448-4bef-acd3-652f5dec57e2\") "
Apr 24 22:07:44.182248 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.182039 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "43d89866-e448-4bef-acd3-652f5dec57e2" (UID: "43d89866-e448-4bef-acd3-652f5dec57e2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:07:44.182248 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.182145 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "43d89866-e448-4bef-acd3-652f5dec57e2" (UID: "43d89866-e448-4bef-acd3-652f5dec57e2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:07:44.182248 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.182199 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:07:44.182248 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.182196 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "43d89866-e448-4bef-acd3-652f5dec57e2" (UID: "43d89866-e448-4bef-acd3-652f5dec57e2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:07:44.182639 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.182577 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "43d89866-e448-4bef-acd3-652f5dec57e2" (UID: "43d89866-e448-4bef-acd3-652f5dec57e2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:07:44.184187 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.184160 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d89866-e448-4bef-acd3-652f5dec57e2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "43d89866-e448-4bef-acd3-652f5dec57e2" (UID: "43d89866-e448-4bef-acd3-652f5dec57e2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:07:44.184319 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.184299 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d89866-e448-4bef-acd3-652f5dec57e2-kube-api-access-p8szb" (OuterVolumeSpecName: "kube-api-access-p8szb") pod "43d89866-e448-4bef-acd3-652f5dec57e2" (UID: "43d89866-e448-4bef-acd3-652f5dec57e2"). InnerVolumeSpecName "kube-api-access-p8szb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:07:44.280359 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.280327 2569 generic.go:358] "Generic (PLEG): container finished" podID="43d89866-e448-4bef-acd3-652f5dec57e2" containerID="8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780" exitCode=0
Apr 24 22:07:44.280539 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.280406 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"
Apr 24 22:07:44.280539 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.280410 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" event={"ID":"43d89866-e448-4bef-acd3-652f5dec57e2","Type":"ContainerDied","Data":"8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780"}
Apr 24 22:07:44.280539 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.280453 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975" event={"ID":"43d89866-e448-4bef-acd3-652f5dec57e2","Type":"ContainerDied","Data":"47392d0b44740970f2d162a425c8d4c971768d41e92a5d546639047ef7b9949a"}
Apr 24 22:07:44.280539 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.280473 2569 scope.go:117] "RemoveContainer" containerID="8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780"
Apr 24 22:07:44.282659 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.282632 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8szb\" (UniqueName: \"kubernetes.io/projected/43d89866-e448-4bef-acd3-652f5dec57e2-kube-api-access-p8szb\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:07:44.282659 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.282657 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/43d89866-e448-4bef-acd3-652f5dec57e2-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:07:44.282857 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.282701 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:07:44.282857 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.282714 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:07:44.282857 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.282728 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/43d89866-e448-4bef-acd3-652f5dec57e2-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:07:44.292369 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.292348 2569 scope.go:117] "RemoveContainer" containerID="5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331"
Apr 24 22:07:44.300465 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.300440 2569 scope.go:117] "RemoveContainer"
containerID="22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5" Apr 24 22:07:44.303946 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.303925 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"] Apr 24 22:07:44.308958 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.308933 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-759f9d68f5z975"] Apr 24 22:07:44.309492 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.309478 2569 scope.go:117] "RemoveContainer" containerID="8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780" Apr 24 22:07:44.309759 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:07:44.309741 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780\": container with ID starting with 8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780 not found: ID does not exist" containerID="8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780" Apr 24 22:07:44.309846 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.309765 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780"} err="failed to get container status \"8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780\": rpc error: code = NotFound desc = could not find container \"8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780\": container with ID starting with 8e0e45e48d9f9a4aa5fe6f7e07fbc3673b02c277a4b098bd61e29799217c2780 not found: ID does not exist" Apr 24 22:07:44.309846 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.309783 2569 scope.go:117] "RemoveContainer" 
containerID="5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331" Apr 24 22:07:44.310020 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:07:44.310000 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331\": container with ID starting with 5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331 not found: ID does not exist" containerID="5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331" Apr 24 22:07:44.310062 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.310026 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331"} err="failed to get container status \"5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331\": rpc error: code = NotFound desc = could not find container \"5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331\": container with ID starting with 5f37ece70ae82317d86d170be977d1b8f93e44787e6673a12bcb1f9967142331 not found: ID does not exist" Apr 24 22:07:44.310062 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.310044 2569 scope.go:117] "RemoveContainer" containerID="22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5" Apr 24 22:07:44.310274 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:07:44.310255 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5\": container with ID starting with 22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5 not found: ID does not exist" containerID="22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5" Apr 24 22:07:44.310313 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:44.310285 2569 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5"} err="failed to get container status \"22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5\": rpc error: code = NotFound desc = could not find container \"22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5\": container with ID starting with 22f74dc1dc0ddf69992d599d843aa973959a4a0d42b2d601f0f2f63c76d166a5 not found: ID does not exist" Apr 24 22:07:45.332163 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:07:45.332128 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" path="/var/lib/kubelet/pods/43d89866-e448-4bef-acd3-652f5dec57e2/volumes" Apr 24 22:09:53.060117 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:53.060025 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"] Apr 24 22:09:53.060648 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:53.060331 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="main" containerID="cri-o://771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a" gracePeriod=30 Apr 24 22:09:53.060648 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:53.060392 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="tokenizer" containerID="cri-o://2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37" gracePeriod=30 Apr 24 22:09:53.786377 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:53.786338 2569 generic.go:358] "Generic (PLEG): container finished" podID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" 
containerID="771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a" exitCode=0 Apr 24 22:09:53.786586 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:53.786404 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" event={"ID":"c77b9224-ba6d-4e2e-80ab-8048ce7054e6","Type":"ContainerDied","Data":"771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a"} Apr 24 22:09:54.308156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.308135 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" Apr 24 22:09:54.381982 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.381899 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tls-certs\") pod \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " Apr 24 22:09:54.381982 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.381959 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-cache\") pod \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " Apr 24 22:09:54.382207 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.381989 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76kft\" (UniqueName: \"kubernetes.io/projected/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kube-api-access-76kft\") pod \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " Apr 24 22:09:54.382207 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382055 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-tmp\") pod \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " Apr 24 22:09:54.382207 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382084 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-uds\") pod \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " Apr 24 22:09:54.382207 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382111 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kserve-provision-location\") pod \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\" (UID: \"c77b9224-ba6d-4e2e-80ab-8048ce7054e6\") " Apr 24 22:09:54.382393 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382275 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c77b9224-ba6d-4e2e-80ab-8048ce7054e6" (UID: "c77b9224-ba6d-4e2e-80ab-8048ce7054e6"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:54.382393 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382354 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c77b9224-ba6d-4e2e-80ab-8048ce7054e6" (UID: "c77b9224-ba6d-4e2e-80ab-8048ce7054e6"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:54.382498 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382395 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c77b9224-ba6d-4e2e-80ab-8048ce7054e6" (UID: "c77b9224-ba6d-4e2e-80ab-8048ce7054e6"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:54.382498 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382456 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.382498 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382468 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-tmp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.382498 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.382476 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tokenizer-uds\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.383196 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.383166 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c77b9224-ba6d-4e2e-80ab-8048ce7054e6" (UID: "c77b9224-ba6d-4e2e-80ab-8048ce7054e6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:54.384130 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.384115 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c77b9224-ba6d-4e2e-80ab-8048ce7054e6" (UID: "c77b9224-ba6d-4e2e-80ab-8048ce7054e6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:09:54.384243 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.384224 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kube-api-access-76kft" (OuterVolumeSpecName: "kube-api-access-76kft") pod "c77b9224-ba6d-4e2e-80ab-8048ce7054e6" (UID: "c77b9224-ba6d-4e2e-80ab-8048ce7054e6"). InnerVolumeSpecName "kube-api-access-76kft". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:09:54.483365 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.483333 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-76kft\" (UniqueName: \"kubernetes.io/projected/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kube-api-access-76kft\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.483365 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.483359 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.483365 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.483369 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77b9224-ba6d-4e2e-80ab-8048ce7054e6-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.622499 ip-10-0-133-73 kubenswrapper[2569]: I0424 
22:09:54.622466 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 22:09:54.622770 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.622748 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" containerName="storage-initializer" containerID="cri-o://d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872" gracePeriod=30 Apr 24 22:09:54.792267 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.792233 2569 generic.go:358] "Generic (PLEG): container finished" podID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerID="2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37" exitCode=0 Apr 24 22:09:54.792439 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.792267 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" event={"ID":"c77b9224-ba6d-4e2e-80ab-8048ce7054e6","Type":"ContainerDied","Data":"2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37"} Apr 24 22:09:54.792439 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.792302 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" Apr 24 22:09:54.792439 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.792317 2569 scope.go:117] "RemoveContainer" containerID="2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37" Apr 24 22:09:54.792439 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.792307 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276" event={"ID":"c77b9224-ba6d-4e2e-80ab-8048ce7054e6","Type":"ContainerDied","Data":"859e43cb0086de622b3cb51648aa2b4570860a732caa2cda86c09828fe9aa5ed"} Apr 24 22:09:54.801839 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.801822 2569 scope.go:117] "RemoveContainer" containerID="771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a" Apr 24 22:09:54.809692 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.809652 2569 scope.go:117] "RemoveContainer" containerID="e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb" Apr 24 22:09:54.816688 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.816654 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"] Apr 24 22:09:54.818213 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.818196 2569 scope.go:117] "RemoveContainer" containerID="2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37" Apr 24 22:09:54.818556 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:09:54.818532 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37\": container with ID starting with 2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37 not found: ID does not exist" containerID="2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37" Apr 24 
22:09:54.818602 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.818569 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37"} err="failed to get container status \"2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37\": rpc error: code = NotFound desc = could not find container \"2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37\": container with ID starting with 2aa35e19220307663fb63d8d94e13c819ee5b26e868bdac9f991edf9d38e0e37 not found: ID does not exist" Apr 24 22:09:54.818602 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.818595 2569 scope.go:117] "RemoveContainer" containerID="771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a" Apr 24 22:09:54.818929 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:09:54.818909 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a\": container with ID starting with 771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a not found: ID does not exist" containerID="771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a" Apr 24 22:09:54.818992 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.818935 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a"} err="failed to get container status \"771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a\": rpc error: code = NotFound desc = could not find container \"771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a\": container with ID starting with 771e326c7df87daf2d017a009d49400edb085d957ed83e7be620bb7d923fb36a not found: ID does not exist" Apr 24 22:09:54.818992 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.818954 2569 scope.go:117] 
"RemoveContainer" containerID="e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb" Apr 24 22:09:54.819195 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:09:54.819174 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb\": container with ID starting with e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb not found: ID does not exist" containerID="e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb" Apr 24 22:09:54.819246 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.819205 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb"} err="failed to get container status \"e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb\": rpc error: code = NotFound desc = could not find container \"e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb\": container with ID starting with e6d3bd253e3021ed9917587834dedb4a9cadbba87fb583fa387cdad3a04bd2cb not found: ID does not exist" Apr 24 22:09:54.820397 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:54.820380 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche76276"] Apr 24 22:09:55.333316 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:09:55.333280 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" path="/var/lib/kubelet/pods/c77b9224-ba6d-4e2e-80ab-8048ce7054e6/volumes" Apr 24 22:10:24.815510 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.815487 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab/storage-initializer/0.log" Apr 24 22:10:24.815806 
ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.815552 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 22:10:24.909486 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.909406 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab/storage-initializer/0.log" Apr 24 22:10:24.909486 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.909456 2569 generic.go:358] "Generic (PLEG): container finished" podID="8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" containerID="d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872" exitCode=137 Apr 24 22:10:24.909719 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.909545 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 22:10:24.909719 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.909548 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab","Type":"ContainerDied","Data":"d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872"} Apr 24 22:10:24.909719 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.909632 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab","Type":"ContainerDied","Data":"4a6f8d64ff9a9e12fd71f642ff2550213ffafdeeda967e7d87510e2fc037bcfe"} Apr 24 22:10:24.909719 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.909651 2569 scope.go:117] "RemoveContainer" containerID="d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872" Apr 24 22:10:24.935005 ip-10-0-133-73 
kubenswrapper[2569]: I0424 22:10:24.934977 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-model-cache\") pod \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " Apr 24 22:10:24.935153 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935076 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kserve-provision-location\") pod \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " Apr 24 22:10:24.935153 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935128 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tls-certs\") pod \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " Apr 24 22:10:24.935153 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935150 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-dshm\") pod \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " Apr 24 22:10:24.935312 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935206 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tmp-dir\") pod \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " Apr 24 22:10:24.935312 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935238 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-home\") pod \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " Apr 24 22:10:24.935312 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935254 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-model-cache" (OuterVolumeSpecName: "model-cache") pod "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" (UID: "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:10:24.935312 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935265 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4cfp\" (UniqueName: \"kubernetes.io/projected/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kube-api-access-m4cfp\") pod \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\" (UID: \"8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab\") " Apr 24 22:10:24.935610 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935591 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-model-cache\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:10:24.935697 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935588 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" (UID: "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:10:24.935772 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.935749 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-home" (OuterVolumeSpecName: "home") pod "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" (UID: "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:10:24.937460 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.937435 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" (UID: "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:10:24.937716 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.937658 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-dshm" (OuterVolumeSpecName: "dshm") pod "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" (UID: "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:10:24.937829 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.937735 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kube-api-access-m4cfp" (OuterVolumeSpecName: "kube-api-access-m4cfp") pod "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" (UID: "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab"). InnerVolumeSpecName "kube-api-access-m4cfp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:10:24.939849 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.939824 2569 scope.go:117] "RemoveContainer" containerID="d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872" Apr 24 22:10:24.940146 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:10:24.940126 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872\": container with ID starting with d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872 not found: ID does not exist" containerID="d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872" Apr 24 22:10:24.940213 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.940158 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872"} err="failed to get container status \"d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872\": rpc error: code = NotFound desc = could not find container \"d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872\": container with ID starting with d350c632cb070c63e2b34118d1e94cb86b76bb67a3c80292eeff138252c80872 not found: ID does not exist" Apr 24 22:10:24.954278 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:24.954250 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" (UID: "8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:10:25.036881 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.036855 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kserve-provision-location\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:10:25.036881 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.036878 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tls-certs\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:10:25.037058 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.036891 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-dshm\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:10:25.037058 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.036899 2569 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-tmp-dir\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:10:25.037058 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.036907 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-home\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:10:25.037058 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.036915 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m4cfp\" (UniqueName: \"kubernetes.io/projected/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab-kube-api-access-m4cfp\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\"" Apr 24 22:10:25.251404 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.251373 2569 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 22:10:25.255169 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.255135 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 22:10:25.336281 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:10:25.336246 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" path="/var/lib/kubelet/pods/8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab/volumes" Apr 24 22:11:29.540792 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:11:29.540761 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 22:11:29.547138 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:11:29.547116 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log" Apr 24 22:14:14.189620 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.189578 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rx77n/must-gather-bfgwt"] Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.189977 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" containerName="storage-initializer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.189995 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" containerName="storage-initializer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190019 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="main" Apr 24 22:14:14.192126 ip-10-0-133-73 
kubenswrapper[2569]: I0424 22:14:14.190028 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="main" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190044 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="storage-initializer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190050 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="storage-initializer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190059 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="storage-initializer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190064 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="storage-initializer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190072 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="tokenizer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190077 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="tokenizer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190090 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="main" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190095 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="main" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 
22:14:14.190112 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="tokenizer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190121 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="tokenizer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190206 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="main" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190215 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="main" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190222 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="43d89866-e448-4bef-acd3-652f5dec57e2" containerName="tokenizer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190227 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c77b9224-ba6d-4e2e-80ab-8048ce7054e6" containerName="tokenizer" Apr 24 22:14:14.192126 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.190235 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ac6219c-2d7c-45b0-b363-8d1b6bb3cdab" containerName="storage-initializer" Apr 24 22:14:14.193110 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.193094 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rx77n/must-gather-bfgwt" Apr 24 22:14:14.195865 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.195828 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rx77n\"/\"openshift-service-ca.crt\"" Apr 24 22:14:14.196029 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.195936 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rx77n\"/\"default-dockercfg-g6fh6\"" Apr 24 22:14:14.197244 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.197227 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rx77n\"/\"kube-root-ca.crt\"" Apr 24 22:14:14.200760 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.200734 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rx77n/must-gather-bfgwt"] Apr 24 22:14:14.327259 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.327213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxmcq\" (UniqueName: \"kubernetes.io/projected/f33db989-5c93-455c-b1e8-e18720c7b480-kube-api-access-pxmcq\") pod \"must-gather-bfgwt\" (UID: \"f33db989-5c93-455c-b1e8-e18720c7b480\") " pod="openshift-must-gather-rx77n/must-gather-bfgwt" Apr 24 22:14:14.327452 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.327312 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f33db989-5c93-455c-b1e8-e18720c7b480-must-gather-output\") pod \"must-gather-bfgwt\" (UID: \"f33db989-5c93-455c-b1e8-e18720c7b480\") " pod="openshift-must-gather-rx77n/must-gather-bfgwt" Apr 24 22:14:14.428765 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.428723 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/f33db989-5c93-455c-b1e8-e18720c7b480-must-gather-output\") pod \"must-gather-bfgwt\" (UID: \"f33db989-5c93-455c-b1e8-e18720c7b480\") " pod="openshift-must-gather-rx77n/must-gather-bfgwt" Apr 24 22:14:14.428765 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.428776 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxmcq\" (UniqueName: \"kubernetes.io/projected/f33db989-5c93-455c-b1e8-e18720c7b480-kube-api-access-pxmcq\") pod \"must-gather-bfgwt\" (UID: \"f33db989-5c93-455c-b1e8-e18720c7b480\") " pod="openshift-must-gather-rx77n/must-gather-bfgwt" Apr 24 22:14:14.429105 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.429085 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f33db989-5c93-455c-b1e8-e18720c7b480-must-gather-output\") pod \"must-gather-bfgwt\" (UID: \"f33db989-5c93-455c-b1e8-e18720c7b480\") " pod="openshift-must-gather-rx77n/must-gather-bfgwt" Apr 24 22:14:14.442023 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.441959 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxmcq\" (UniqueName: \"kubernetes.io/projected/f33db989-5c93-455c-b1e8-e18720c7b480-kube-api-access-pxmcq\") pod \"must-gather-bfgwt\" (UID: \"f33db989-5c93-455c-b1e8-e18720c7b480\") " pod="openshift-must-gather-rx77n/must-gather-bfgwt" Apr 24 22:14:14.503381 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.503349 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rx77n/must-gather-bfgwt" Apr 24 22:14:14.634086 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.634057 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rx77n/must-gather-bfgwt"] Apr 24 22:14:14.635695 ip-10-0-133-73 kubenswrapper[2569]: W0424 22:14:14.635647 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf33db989_5c93_455c_b1e8_e18720c7b480.slice/crio-648d85914029c5ee59dab55bbf7d11e5dd91a2866ab579a4b451ee123e95c2f9 WatchSource:0}: Error finding container 648d85914029c5ee59dab55bbf7d11e5dd91a2866ab579a4b451ee123e95c2f9: Status 404 returned error can't find the container with id 648d85914029c5ee59dab55bbf7d11e5dd91a2866ab579a4b451ee123e95c2f9 Apr 24 22:14:14.637348 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.637330 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:14:14.794368 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:14.794330 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rx77n/must-gather-bfgwt" event={"ID":"f33db989-5c93-455c-b1e8-e18720c7b480","Type":"ContainerStarted","Data":"648d85914029c5ee59dab55bbf7d11e5dd91a2866ab579a4b451ee123e95c2f9"} Apr 24 22:14:18.815962 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:18.815918 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rx77n/must-gather-bfgwt" event={"ID":"f33db989-5c93-455c-b1e8-e18720c7b480","Type":"ContainerStarted","Data":"a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219"} Apr 24 22:14:19.822587 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:19.822550 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rx77n/must-gather-bfgwt" 
event={"ID":"f33db989-5c93-455c-b1e8-e18720c7b480","Type":"ContainerStarted","Data":"e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753"} Apr 24 22:14:19.840824 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:19.840753 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rx77n/must-gather-bfgwt" podStartSLOduration=1.820786194 podStartE2EDuration="5.840732336s" podCreationTimestamp="2026-04-24 22:14:14 +0000 UTC" firstStartedPulling="2026-04-24 22:14:14.637461772 +0000 UTC m=+2865.877815011" lastFinishedPulling="2026-04-24 22:14:18.65740791 +0000 UTC m=+2869.897761153" observedRunningTime="2026-04-24 22:14:19.839233092 +0000 UTC m=+2871.079586346" watchObservedRunningTime="2026-04-24 22:14:19.840732336 +0000 UTC m=+2871.081085598" Apr 24 22:14:28.280059 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:28.280029 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:29.270186 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:29.270146 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:30.279409 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:30.279356 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:31.268642 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:31.268610 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:32.242613 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:32.242584 2569 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:33.224980 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:33.224950 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:34.181548 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:34.181518 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:35.140088 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:35.140055 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:36.108351 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:36.108320 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:37.054750 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:37.054714 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:38.027295 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:38.027266 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:38.994791 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:38.994765 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:40.015013 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:40.014984 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:41.043875 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:41.043843 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-b9jxx_7321a418-5367-453b-83e2-4f814b7bfcb0/istio-proxy/0.log" Apr 24 22:14:42.083423 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:42.083379 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-5wn4v_aa08ecbd-1f58-4d2c-a231-3db5090bb227/istio-proxy/0.log" Apr 24 22:14:42.919455 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:42.919424 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-5wn4v_aa08ecbd-1f58-4d2c-a231-3db5090bb227/istio-proxy/0.log" Apr 24 22:14:43.713690 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:43.713637 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-cn95g_aaedb57f-ea16-416f-9886-ed7e3462f547/manager/0.log" Apr 24 22:14:43.727286 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:43.727262 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-b9b2h_38cd02a3-c2aa-46cd-b870-a44a5cd71fe9/manager/0.log" Apr 24 22:14:43.792768 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:43.792731 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-nw8j6_04c606e3-591e-4913-844a-50bbb025eaad/limitador/0.log" Apr 24 22:14:43.808101 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:43.808078 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-499d9_16fb4c1f-74d0-4dd7-a780-5de816a3d86d/manager/0.log" Apr 24 22:14:44.932316 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:44.932283 2569 generic.go:358] "Generic (PLEG): container finished" podID="f33db989-5c93-455c-b1e8-e18720c7b480" containerID="a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219" exitCode=0 Apr 24 22:14:44.932740 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:44.932348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rx77n/must-gather-bfgwt" event={"ID":"f33db989-5c93-455c-b1e8-e18720c7b480","Type":"ContainerDied","Data":"a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219"} Apr 24 22:14:44.932740 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:44.932706 2569 scope.go:117] "RemoveContainer" containerID="a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219" Apr 24 22:14:45.520395 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:45.520346 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rx77n_must-gather-bfgwt_f33db989-5c93-455c-b1e8-e18720c7b480/gather/0.log" Apr 24 22:14:46.194404 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.194366 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lj9kt/must-gather-tqnfx"] Apr 24 22:14:46.199202 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.199178 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lj9kt/must-gather-tqnfx" Apr 24 22:14:46.201788 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.201770 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lj9kt\"/\"openshift-service-ca.crt\"" Apr 24 22:14:46.201926 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.201902 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lj9kt\"/\"kube-root-ca.crt\"" Apr 24 22:14:46.203010 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.202992 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lj9kt\"/\"default-dockercfg-w6nqf\"" Apr 24 22:14:46.208728 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.208704 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lj9kt/must-gather-tqnfx"] Apr 24 22:14:46.313385 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.313348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90ab34d9-fbcb-4b20-93ee-6e657ef1e57f-must-gather-output\") pod \"must-gather-tqnfx\" (UID: \"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f\") " pod="openshift-must-gather-lj9kt/must-gather-tqnfx" Apr 24 22:14:46.313588 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.313427 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xb4r\" (UniqueName: \"kubernetes.io/projected/90ab34d9-fbcb-4b20-93ee-6e657ef1e57f-kube-api-access-6xb4r\") pod \"must-gather-tqnfx\" (UID: \"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f\") " pod="openshift-must-gather-lj9kt/must-gather-tqnfx" Apr 24 22:14:46.413840 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.413800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xb4r\" (UniqueName: 
\"kubernetes.io/projected/90ab34d9-fbcb-4b20-93ee-6e657ef1e57f-kube-api-access-6xb4r\") pod \"must-gather-tqnfx\" (UID: \"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f\") " pod="openshift-must-gather-lj9kt/must-gather-tqnfx" Apr 24 22:14:46.414025 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.413868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90ab34d9-fbcb-4b20-93ee-6e657ef1e57f-must-gather-output\") pod \"must-gather-tqnfx\" (UID: \"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f\") " pod="openshift-must-gather-lj9kt/must-gather-tqnfx" Apr 24 22:14:46.414185 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.414171 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90ab34d9-fbcb-4b20-93ee-6e657ef1e57f-must-gather-output\") pod \"must-gather-tqnfx\" (UID: \"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f\") " pod="openshift-must-gather-lj9kt/must-gather-tqnfx" Apr 24 22:14:46.426947 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.426918 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xb4r\" (UniqueName: \"kubernetes.io/projected/90ab34d9-fbcb-4b20-93ee-6e657ef1e57f-kube-api-access-6xb4r\") pod \"must-gather-tqnfx\" (UID: \"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f\") " pod="openshift-must-gather-lj9kt/must-gather-tqnfx" Apr 24 22:14:46.509739 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.509651 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lj9kt/must-gather-tqnfx" Apr 24 22:14:46.641271 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.641245 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lj9kt/must-gather-tqnfx"] Apr 24 22:14:46.643273 ip-10-0-133-73 kubenswrapper[2569]: W0424 22:14:46.643235 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90ab34d9_fbcb_4b20_93ee_6e657ef1e57f.slice/crio-7504c52a3bad2e8c7054ca64add2024cdd26d0df7c0a7ac208ea59290d2e5ad0 WatchSource:0}: Error finding container 7504c52a3bad2e8c7054ca64add2024cdd26d0df7c0a7ac208ea59290d2e5ad0: Status 404 returned error can't find the container with id 7504c52a3bad2e8c7054ca64add2024cdd26d0df7c0a7ac208ea59290d2e5ad0 Apr 24 22:14:46.942545 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:46.942462 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj9kt/must-gather-tqnfx" event={"ID":"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f","Type":"ContainerStarted","Data":"7504c52a3bad2e8c7054ca64add2024cdd26d0df7c0a7ac208ea59290d2e5ad0"} Apr 24 22:14:47.950483 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:47.950393 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj9kt/must-gather-tqnfx" event={"ID":"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f","Type":"ContainerStarted","Data":"a4b901b2fbeafdef58592444ef4413a1f802d3de4fd654aa8acac7a018df1da2"} Apr 24 22:14:47.951843 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:47.951816 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj9kt/must-gather-tqnfx" event={"ID":"90ab34d9-fbcb-4b20-93ee-6e657ef1e57f","Type":"ContainerStarted","Data":"ec821504887cabe77c2eeca3b1197b8e8deb7d85e9310adf8408c5262172faf5"} Apr 24 22:14:47.971604 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:47.971542 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-lj9kt/must-gather-tqnfx" podStartSLOduration=1.208359346 podStartE2EDuration="1.971524439s" podCreationTimestamp="2026-04-24 22:14:46 +0000 UTC" firstStartedPulling="2026-04-24 22:14:46.645038097 +0000 UTC m=+2897.885391335" lastFinishedPulling="2026-04-24 22:14:47.408203181 +0000 UTC m=+2898.648556428" observedRunningTime="2026-04-24 22:14:47.970129002 +0000 UTC m=+2899.210482286" watchObservedRunningTime="2026-04-24 22:14:47.971524439 +0000 UTC m=+2899.211877701" Apr 24 22:14:48.959645 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:48.959615 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9zcwj_8179f9fe-0d63-49d2-84df-a9763b98a8c6/global-pull-secret-syncer/0.log" Apr 24 22:14:49.074701 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:49.074630 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6bv5v_eeb650ea-cab6-4757-8b65-a0b656f23baf/konnectivity-agent/0.log" Apr 24 22:14:49.219183 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:49.219087 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-73.ec2.internal_d74e5b7f3ca859862ed6413284694748/haproxy/0.log" Apr 24 22:14:51.035285 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.035229 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rx77n/must-gather-bfgwt"] Apr 24 22:14:51.035901 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.035555 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rx77n/must-gather-bfgwt" podUID="f33db989-5c93-455c-b1e8-e18720c7b480" containerName="copy" containerID="cri-o://e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753" gracePeriod=2 Apr 24 22:14:51.038295 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.038256 2569 status_manager.go:895] "Failed to get status for pod" 
podUID="f33db989-5c93-455c-b1e8-e18720c7b480" pod="openshift-must-gather-rx77n/must-gather-bfgwt" err="pods \"must-gather-bfgwt\" is forbidden: User \"system:node:ip-10-0-133-73.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rx77n\": no relationship found between node 'ip-10-0-133-73.ec2.internal' and this object"
Apr 24 22:14:51.041388 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.041363 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rx77n/must-gather-bfgwt"]
Apr 24 22:14:51.418711 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.418455 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rx77n_must-gather-bfgwt_f33db989-5c93-455c-b1e8-e18720c7b480/copy/0.log"
Apr 24 22:14:51.418973 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.418915 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rx77n/must-gather-bfgwt"
Apr 24 22:14:51.490904 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.490176 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f33db989-5c93-455c-b1e8-e18720c7b480-must-gather-output\") pod \"f33db989-5c93-455c-b1e8-e18720c7b480\" (UID: \"f33db989-5c93-455c-b1e8-e18720c7b480\") "
Apr 24 22:14:51.490904 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.490272 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxmcq\" (UniqueName: \"kubernetes.io/projected/f33db989-5c93-455c-b1e8-e18720c7b480-kube-api-access-pxmcq\") pod \"f33db989-5c93-455c-b1e8-e18720c7b480\" (UID: \"f33db989-5c93-455c-b1e8-e18720c7b480\") "
Apr 24 22:14:51.494489 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.494425 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33db989-5c93-455c-b1e8-e18720c7b480-kube-api-access-pxmcq" (OuterVolumeSpecName: "kube-api-access-pxmcq") pod "f33db989-5c93-455c-b1e8-e18720c7b480" (UID: "f33db989-5c93-455c-b1e8-e18720c7b480"). InnerVolumeSpecName "kube-api-access-pxmcq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:14:51.501288 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.501253 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33db989-5c93-455c-b1e8-e18720c7b480-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f33db989-5c93-455c-b1e8-e18720c7b480" (UID: "f33db989-5c93-455c-b1e8-e18720c7b480"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:14:51.591599 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.591530 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxmcq\" (UniqueName: \"kubernetes.io/projected/f33db989-5c93-455c-b1e8-e18720c7b480-kube-api-access-pxmcq\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:14:51.591599 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.591569 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f33db989-5c93-455c-b1e8-e18720c7b480-must-gather-output\") on node \"ip-10-0-133-73.ec2.internal\" DevicePath \"\""
Apr 24 22:14:51.971638 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.971607 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rx77n_must-gather-bfgwt_f33db989-5c93-455c-b1e8-e18720c7b480/copy/0.log"
Apr 24 22:14:51.975235 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.972309 2569 generic.go:358] "Generic (PLEG): container finished" podID="f33db989-5c93-455c-b1e8-e18720c7b480" containerID="e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753" exitCode=143
Apr 24 22:14:51.975235 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.972453 2569 scope.go:117] "RemoveContainer" containerID="e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753"
Apr 24 22:14:51.975235 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.972593 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rx77n/must-gather-bfgwt"
Apr 24 22:14:51.989989 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:51.989961 2569 scope.go:117] "RemoveContainer" containerID="a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219"
Apr 24 22:14:52.010705 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:52.010661 2569 scope.go:117] "RemoveContainer" containerID="e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753"
Apr 24 22:14:52.011096 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:14:52.011065 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753\": container with ID starting with e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753 not found: ID does not exist" containerID="e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753"
Apr 24 22:14:52.011193 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:52.011108 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753"} err="failed to get container status \"e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753\": rpc error: code = NotFound desc = could not find container \"e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753\": container with ID starting with e76d8b25cfdfeaacd6d58aece658fc48721e3e7217e50416a67c4e7b5b745753 not found: ID does not exist"
Apr 24 22:14:52.011193 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:52.011136 2569 scope.go:117] "RemoveContainer" containerID="a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219"
Apr 24 22:14:52.011387 ip-10-0-133-73 kubenswrapper[2569]: E0424 22:14:52.011366 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219\": container with ID starting with a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219 not found: ID does not exist" containerID="a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219"
Apr 24 22:14:52.011452 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:52.011395 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219"} err="failed to get container status \"a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219\": rpc error: code = NotFound desc = could not find container \"a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219\": container with ID starting with a5c24ee1c9fa46114e5e8bfb5d53fade9c76e3f40dc0964b58790ff0cd6e0219 not found: ID does not exist"
Apr 24 22:14:53.012873 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:53.012836 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-cn95g_aaedb57f-ea16-416f-9886-ed7e3462f547/manager/0.log"
Apr 24 22:14:53.040929 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:53.040892 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-b9b2h_38cd02a3-c2aa-46cd-b870-a44a5cd71fe9/manager/0.log"
Apr 24 22:14:53.168859 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:53.168827 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-nw8j6_04c606e3-591e-4913-844a-50bbb025eaad/limitador/0.log"
Apr 24 22:14:53.218271 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:53.218237 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-499d9_16fb4c1f-74d0-4dd7-a780-5de816a3d86d/manager/0.log"
Apr 24 22:14:53.337409 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:53.337310 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33db989-5c93-455c-b1e8-e18720c7b480" path="/var/lib/kubelet/pods/f33db989-5c93-455c-b1e8-e18720c7b480/volumes"
Apr 24 22:14:54.771719 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:54.771684 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmjqv_8abf106c-e808-4363-b918-dad844446b34/node-exporter/0.log"
Apr 24 22:14:54.799042 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:54.798981 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmjqv_8abf106c-e808-4363-b918-dad844446b34/kube-rbac-proxy/0.log"
Apr 24 22:14:54.824634 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:54.824600 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmjqv_8abf106c-e808-4363-b918-dad844446b34/init-textfile/0.log"
Apr 24 22:14:55.237099 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:55.236990 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-wqj4h_ffd7379c-7840-42d9-9eaa-f4fdd5edf13c/prometheus-operator-admission-webhook/0.log"
Apr 24 22:14:57.764700 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:57.764645 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-bwk94_5bcc1f5d-07ac-458a-bbdd-0e8a5bbc5f5b/download-server/0.log"
Apr 24 22:14:58.080390 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.080302 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"]
Apr 24 22:14:58.080895 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.080866 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f33db989-5c93-455c-b1e8-e18720c7b480" containerName="copy"
Apr 24 22:14:58.080895 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.080893 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33db989-5c93-455c-b1e8-e18720c7b480" containerName="copy"
Apr 24 22:14:58.081140 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.080929 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f33db989-5c93-455c-b1e8-e18720c7b480" containerName="gather"
Apr 24 22:14:58.081140 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.080937 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33db989-5c93-455c-b1e8-e18720c7b480" containerName="gather"
Apr 24 22:14:58.081140 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.081027 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f33db989-5c93-455c-b1e8-e18720c7b480" containerName="copy"
Apr 24 22:14:58.081140 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.081042 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f33db989-5c93-455c-b1e8-e18720c7b480" containerName="gather"
Apr 24 22:14:58.086190 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.086154 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.091839 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.091811 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"]
Apr 24 22:14:58.158821 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.158789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-proc\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.158988 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.158848 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9jxk\" (UniqueName: \"kubernetes.io/projected/26d4d57b-419a-4d3d-b250-d896619d7fa6-kube-api-access-n9jxk\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.158988 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.158873 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-podres\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.158988 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.158895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-lib-modules\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.158988 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.158928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-sys\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260293 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9jxk\" (UniqueName: \"kubernetes.io/projected/26d4d57b-419a-4d3d-b250-d896619d7fa6-kube-api-access-n9jxk\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260477 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-podres\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260477 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-lib-modules\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260477 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260389 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-sys\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260477 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-proc\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260715 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260536 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-sys\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260715 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260545 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-proc\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260715 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-lib-modules\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.260715 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.260590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/26d4d57b-419a-4d3d-b250-d896619d7fa6-podres\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.270280 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.270245 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9jxk\" (UniqueName: \"kubernetes.io/projected/26d4d57b-419a-4d3d-b250-d896619d7fa6-kube-api-access-n9jxk\") pod \"perf-node-gather-daemonset-mb7l5\" (UID: \"26d4d57b-419a-4d3d-b250-d896619d7fa6\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.401999 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.401915 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:58.549156 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:58.549107 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"]
Apr 24 22:14:59.008766 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.008726 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5" event={"ID":"26d4d57b-419a-4d3d-b250-d896619d7fa6","Type":"ContainerStarted","Data":"9fd9a6cfab6a820f42796b46a4fe7c60253c99d45bfceb91da331fff5a7f3994"}
Apr 24 22:14:59.008766 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.008764 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5" event={"ID":"26d4d57b-419a-4d3d-b250-d896619d7fa6","Type":"ContainerStarted","Data":"c7aa4b4cc436a4b90f60ff9c2dead2fa34d827a5752f83984715c863b381e402"}
Apr 24 22:14:59.009294 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.008811 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:14:59.010514 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.010493 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h6bcf_74d17f36-1527-485c-ad29-abd7e0f42a70/dns/0.log"
Apr 24 22:14:59.026116 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.026079 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5" podStartSLOduration=1.026066115 podStartE2EDuration="1.026066115s" podCreationTimestamp="2026-04-24 22:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:14:59.025273234 +0000 UTC m=+2910.265626496" watchObservedRunningTime="2026-04-24 22:14:59.026066115 +0000 UTC m=+2910.266419376"
Apr 24 22:14:59.035921 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.035895 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h6bcf_74d17f36-1527-485c-ad29-abd7e0f42a70/kube-rbac-proxy/0.log"
Apr 24 22:14:59.156907 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.156877 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hkcmr_3a5a0d8d-bcc0-4444-bbd8-a88bae8670d8/dns-node-resolver/0.log"
Apr 24 22:14:59.676479 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.676433 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7859dbcb8d-bfkg5_12bfbd31-9b7c-48ae-b6fd-469c93decd67/registry/0.log"
Apr 24 22:14:59.699560 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:14:59.699522 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9pchk_907a584d-065c-4c42-a7ff-3db1f2519bf9/node-ca/0.log"
Apr 24 22:15:00.591642 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:00.591566 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-5wn4v_aa08ecbd-1f58-4d2c-a231-3db5090bb227/istio-proxy/0.log"
Apr 24 22:15:01.091596 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:01.091558 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4ph6t_867d54ac-7685-453e-ab98-671a28e06ea0/serve-healthcheck-canary/0.log"
Apr 24 22:15:01.779972 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:01.779937 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w28n7_2231f388-e1f5-4ecf-b03c-f5f6c48450c3/kube-rbac-proxy/0.log"
Apr 24 22:15:01.801852 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:01.801826 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w28n7_2231f388-e1f5-4ecf-b03c-f5f6c48450c3/exporter/0.log"
Apr 24 22:15:01.825780 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:01.825753 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w28n7_2231f388-e1f5-4ecf-b03c-f5f6c48450c3/extractor/0.log"
Apr 24 22:15:04.902814 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:04.902783 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84b6647887-hqblb_db338508-f5cc-4372-a3b9-52b695a5cea7/manager/0.log"
Apr 24 22:15:04.967298 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:04.967271 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-8bb7t_634e9e8a-cb8f-4839-b5a5-8704e670513f/server/0.log"
Apr 24 22:15:05.023998 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:05.023967 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-mb7l5"
Apr 24 22:15:05.372509 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:05.372477 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-q9mzt_c41fe313-d889-48bf-b3a0-148f1175b7e0/manager/0.log"
Apr 24 22:15:05.393602 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:05.393577 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-4zmd8_0eea4ff2-f92f-4c59-8632-2fc7cb0ba868/s3-init/0.log"
Apr 24 22:15:05.425009 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:05.424981 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-fx2vc_dfb56860-3914-407b-874f-fced08269626/seaweedfs/0.log"
Apr 24 22:15:10.094042 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:10.094009 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kk8l9_59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c/migrator/0.log"
Apr 24 22:15:10.124594 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:10.124562 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-kk8l9_59df3ff1-21a2-4ad6-aea0-e7c4cd6aee2c/graceful-termination/0.log"
Apr 24 22:15:11.501536 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:11.501505 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l64cf_a34653fe-8931-4a96-adb4-518f9c93a246/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:15:11.525221 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:11.525185 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l64cf_a34653fe-8931-4a96-adb4-518f9c93a246/egress-router-binary-copy/0.log"
Apr 24 22:15:11.549823 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:11.549792 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l64cf_a34653fe-8931-4a96-adb4-518f9c93a246/cni-plugins/0.log"
Apr 24 22:15:11.576303 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:11.576273 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l64cf_a34653fe-8931-4a96-adb4-518f9c93a246/bond-cni-plugin/0.log"
Apr 24 22:15:11.599732 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:11.599697 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l64cf_a34653fe-8931-4a96-adb4-518f9c93a246/routeoverride-cni/0.log"
Apr 24 22:15:11.625952 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:11.625927 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l64cf_a34653fe-8931-4a96-adb4-518f9c93a246/whereabouts-cni-bincopy/0.log"
Apr 24 22:15:11.648707 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:11.648656 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l64cf_a34653fe-8931-4a96-adb4-518f9c93a246/whereabouts-cni/0.log"
Apr 24 22:15:12.048857 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:12.048801 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csnv2_d47b5659-b736-4e8d-abe4-3cee234ead85/kube-multus/0.log"
Apr 24 22:15:12.154118 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:12.154082 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7dwzn_7ac4e3b6-523e-4f41-98be-ceb879813ac3/network-metrics-daemon/0.log"
Apr 24 22:15:12.178079 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:12.178045 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7dwzn_7ac4e3b6-523e-4f41-98be-ceb879813ac3/kube-rbac-proxy/0.log"
Apr 24 22:15:13.026654 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.026616 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-controller/0.log"
Apr 24 22:15:13.044897 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.044868 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/0.log"
Apr 24 22:15:13.070796 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.070764 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovn-acl-logging/1.log"
Apr 24 22:15:13.094742 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.094713 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/kube-rbac-proxy-node/0.log"
Apr 24 22:15:13.118289 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.118254 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:15:13.137120 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.137090 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/northd/0.log"
Apr 24 22:15:13.160178 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.160149 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/nbdb/0.log"
Apr 24 22:15:13.185509 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.185472 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/sbdb/0.log"
Apr 24 22:15:13.397252 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:13.397154 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jgb8f_f94d99f1-b1b1-4885-b23c-789c312e3426/ovnkube-controller/0.log"
Apr 24 22:15:15.053283 ip-10-0-133-73 kubenswrapper[2569]: I0424 22:15:15.053247 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bg9mt_562ff80c-46f9-46ea-bfc9-cacccd0662db/network-check-target-container/0.log"