Apr 16 17:40:26.497457 ip-10-0-143-216 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 17:40:26.966596 ip-10-0-143-216 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:26.966596 ip-10-0-143-216 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 17:40:26.966596 ip-10-0-143-216 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:26.966596 ip-10-0-143-216 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 17:40:26.966596 ip-10-0-143-216 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:26.969217 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.969114 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 17:40:26.971469 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971453 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:26.971469 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971469 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971473 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971476 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971479 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971483 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971486 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971489 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971492 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971495 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971498 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971501 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971504 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971509 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971513 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971516 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971519 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971522 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971524 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971527 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:26.971538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971530 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971533 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971536 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971539 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971542 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971545 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971548 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971551 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971554 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971556 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971559 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971562 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971564 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971567 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971569 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971572 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971575 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971578 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971580 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971582 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:26.971983 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971585 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971587 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971596 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971598 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971601 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971603 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971606 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971608 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971610 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971613 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971615 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971618 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971620 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971623 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971626 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971629 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971631 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971634 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971636 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971639 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:26.972485 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971641 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971644 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971647 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971649 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971653 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971656 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971659 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971662 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971665 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971668 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971670 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971673 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971675 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971678 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971680 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971682 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971685 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971688 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971690 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971694 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:26.972963 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971697 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971699 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971702 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971705 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971707 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.971710 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972120 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972124 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972128 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972130 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972133 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972135 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972138 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972141 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972143 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972146 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972150 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972168 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972171 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972174 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:26.973465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972176 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972179 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972181 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972185 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972190 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972192 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972195 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972198 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972200 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972203 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972206 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972208 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972211 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972213 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972216 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972219 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972221 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972224 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972227 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972230 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:26.973951 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972233 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972237 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972242 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972245 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972248 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972251 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972254 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972256 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972259 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972270 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972272 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972275 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972278 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972280 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972283 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972285 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972288 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972290 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972293 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:26.974466 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972295 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972298 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972300 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972303 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972307 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972309 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972312 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972314 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972317 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972320 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972323 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972325 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972328 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972330 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972333 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972336 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972338 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972341 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972343 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972346 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:26.974941 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972348 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972352 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972354 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972357 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972360 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972362 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972365 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972367 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972371 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972373 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972376 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972378 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.972381 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973098 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973108 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973115 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973119 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973124 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973128 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973132 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973137 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 17:40:26.975488 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973140 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973143 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973147 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973151 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973168 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973171 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973174 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973177 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973180 2580 flags.go:64] FLAG: --cloud-config=""
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973183 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973186 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973190 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973193 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973196 2580 flags.go:64] FLAG: --config-dir=""
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973199 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973203 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973207 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973210 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973214 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973217 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973220 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973223 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973226 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973229 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973232 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 17:40:26.975985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973237 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973240 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973243 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973246 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973249 2580 flags.go:64] FLAG: --enable-server="true"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973252 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973258 2580 flags.go:64] FLAG: --event-burst="100"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973261 2580 flags.go:64] FLAG: --event-qps="50"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973264 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973267 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973270 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973274 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973277 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973280 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973283 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973286 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973289 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973292 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973295 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973298 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973301 2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973304 2580 flags.go:64] FLAG: --feature-gates=""
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973307 2580 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973311 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973314 2580 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Apr 16 17:40:26.976585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973318 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973321 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973324 2580 flags.go:64] FLAG: --help="false" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973327 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-143-216.ec2.internal" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973330 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973333 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973336 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973339 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973343 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973345 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973348 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973351 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973354 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 17:40:26.977235 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:40:26.973357 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973360 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973363 2580 flags.go:64] FLAG: --kube-reserved="" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973366 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973369 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973372 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973374 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973377 2580 flags.go:64] FLAG: --lock-file="" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973380 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973383 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973387 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 17:40:26.977235 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973392 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973395 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973398 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973401 2580 flags.go:64] FLAG: 
--logging-format="text" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973404 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973407 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973410 2580 flags.go:64] FLAG: --manifest-url="" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973413 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973420 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973423 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973427 2580 flags.go:64] FLAG: --max-pods="110" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973430 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973433 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973436 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973439 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973442 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973444 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973447 2580 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973456 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973462 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973465 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973468 2580 flags.go:64] FLAG: --pod-cidr="" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973471 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 17:40:26.977828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973477 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973480 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973483 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973486 2580 flags.go:64] FLAG: --port="10250" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973489 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973492 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0dd90ee6514caedc3" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973495 2580 flags.go:64] FLAG: --qos-reserved="" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973498 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973501 
2580 flags.go:64] FLAG: --register-node="true" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973504 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973507 2580 flags.go:64] FLAG: --register-with-taints="" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973511 2580 flags.go:64] FLAG: --registry-burst="10" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973514 2580 flags.go:64] FLAG: --registry-qps="5" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973516 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973519 2580 flags.go:64] FLAG: --reserved-memory="" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973523 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973525 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973530 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973533 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973536 2580 flags.go:64] FLAG: --runonce="false" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973538 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973541 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973544 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 
17:40:26.973547 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973550 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973553 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 17:40:26.978433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973556 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973560 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973563 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973567 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973570 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973572 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973576 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973578 2580 flags.go:64] FLAG: --system-cgroups="" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973581 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973587 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973590 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973593 2580 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973597 2580 flags.go:64] FLAG: --tls-min-version="" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973599 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973602 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973605 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973609 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973612 2580 flags.go:64] FLAG: --v="2" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973616 2580 flags.go:64] FLAG: --version="false" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973620 2580 flags.go:64] FLAG: --vmodule="" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973624 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.973628 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973730 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973734 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:40:26.979044 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973738 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973741 2580 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973744 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973746 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973749 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973752 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973754 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973757 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973759 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973761 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973764 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973768 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973770 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973773 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:40:26.979659 
ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973775 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973778 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973780 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973783 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973786 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973788 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 17:40:26.979659 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973791 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973794 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973796 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973799 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973801 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973804 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973806 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:40:26.980169 ip-10-0-143-216 
kubenswrapper[2580]: W0416 17:40:26.973809 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973811 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973814 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973817 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973819 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973822 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973825 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973828 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973830 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973833 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973836 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973838 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973841 2580 feature_gate.go:328] unrecognized feature gate: 
PreconfiguredUDNAddresses Apr 16 17:40:26.980169 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973843 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973846 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973849 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973852 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973856 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973860 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973863 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973866 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973869 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973872 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973875 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973880 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973883 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973885 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973888 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973891 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973893 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973896 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973898 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:40:26.980662 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973901 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973903 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973906 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973908 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973911 2580 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973914 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973916 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973919 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973921 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973924 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973927 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973929 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973932 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973934 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973936 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973939 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973942 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 
17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973945 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973947 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973950 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:26.981116 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973953 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:26.981613 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973955 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:26.981613 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973958 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:26.981613 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973961 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:26.981613 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.973963 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:26.981613 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.974558 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:26.982055 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.982034 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 17:40:26.982088 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.982056 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 17:40:26.983312 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983282 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:26.983312 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983307 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:26.983312 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983311 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:26.983522 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983346 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:26.983522 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983373 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:26.983522 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983381 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:26.983522 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983390 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:26.983522 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983406 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:26.983522 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983411 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:26.983522 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983416 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:26.983522 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983420 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983584 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983592 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983595 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983598 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983602 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983605 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983608 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983611 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983614 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983617 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983619 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983623 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983625 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983628 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983631 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983634 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983637 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983639 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983642 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:26.983826 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983645 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983647 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983650 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983653 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983655 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983657 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983660 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983664 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983666 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983671 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983675 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983679 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983682 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983685 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983689 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983692 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983694 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983697 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983699 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:26.984440 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983702 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983704 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983707 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983710 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983712 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983715 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983717 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983720 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983722 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983725 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983727 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983730 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983732 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983734 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983737 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983739 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983742 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983744 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983747 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983750 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:26.984927 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983753 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983755 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983758 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983760 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983763 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983766 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983769 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983773 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983777 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983780 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983783 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983786 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983788 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983791 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983794 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983796 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:26.985425 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.983799 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.983804 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984649 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984655 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984659 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984662 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984665 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984667 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984670 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984673 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984675 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984678 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984681 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984684 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984686 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984689 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:26.985824 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984691 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984694 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984696 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984699 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984702 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984704 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984708 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984712 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984715 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984717 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984720 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984723 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984726 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984728 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984731 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984733 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984736 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984738 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984741 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:26.986230 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984743 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984746 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984748 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984750 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984753 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984755 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984758 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984760 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984762 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984765 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984767 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984770 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984772 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984775 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984777 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984780 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984782 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984784 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984787 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984790 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:26.986710 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984793 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984795 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984798 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984800 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984807 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984810 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984813 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984816 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984819 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984821 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984824 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984826 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984829 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984832 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984835 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984839 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984842 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984844 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984847 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:26.987209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984850 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984853 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984856 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984858 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984861 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984864 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984867 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984869 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984872 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984874 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984877 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984879 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984881 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:26.984885 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.984889 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:26.987672 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.985812 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 17:40:26.992986 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.992971 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 17:40:26.994047 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.994035 2580 server.go:1019] "Starting client certificate rotation"
Apr 16 17:40:26.994172 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.994140 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:40:26.994240 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:26.994207 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:40:27.022132 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.022101 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:40:27.024747 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.024720 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:40:27.037047 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.037024 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 16 17:40:27.042802 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.042782 2580 log.go:25] "Validated CRI v1 image API"
Apr 16 17:40:27.047303 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.047279 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 17:40:27.053175 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.053130 2580 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9a96b7e7-f41a-405b-9603-6bc0085c1051:/dev/nvme0n1p3 bb9f81cd-096a-451a-b4c6-e1ca51ac88f7:/dev/nvme0n1p4]
Apr 16 17:40:27.053255 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.053175 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 17:40:27.060195 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.060050 2580 manager.go:217] Machine: {Timestamp:2026-04-16 17:40:27.058119161 +0000 UTC m=+0.432293581 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3157935 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24b7bef953e957686f9d7ca4a4097c SystemUUID:ec24b7be-f953-e957-686f-9d7ca4a4097c BootID:1e5c589a-93a4-425c-9a44-1906053b8006 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:eb:ae:67:30:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:eb:ae:67:30:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:06:f3:f8:fc:31 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 17:40:27.060195 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.060188 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 17:40:27.060346 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.060278 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 17:40:27.061420 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.061396 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 17:40:27.061559 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.061423 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-216.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 17:40:27.061607 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.061568 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 17:40:27.061607 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.061579 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 17:40:27.061607 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.061591 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:40:27.062671 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.062659 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:40:27.064142 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.064132 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 17:40:27.064203 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.064147 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:40:27.064459 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.064448 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 17:40:27.067920 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.067907 2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 17:40:27.067972 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.067925 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 17:40:27.067972 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.067942 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 17:40:27.067972 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.067952 2580 kubelet.go:397] "Adding apiserver pod source"
Apr 16 17:40:27.067972 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.067961 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 17:40:27.069214 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.069202 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:40:27.069286 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.069221 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information
tracking" Apr 16 17:40:27.072150 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.072133 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 17:40:27.074277 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.074263 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 17:40:27.075634 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075621 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 17:40:27.075690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075639 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 17:40:27.075690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075646 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 17:40:27.075690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075652 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 17:40:27.075690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075660 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 17:40:27.075690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075669 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 17:40:27.075690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075677 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 17:40:27.075690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075682 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 17:40:27.075690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075690 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 17:40:27.075899 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:40:27.075710 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 17:40:27.075899 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075727 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 17:40:27.075899 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.075736 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 17:40:27.076804 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.076794 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 17:40:27.076804 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.076804 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 17:40:27.080908 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.080893 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 17:40:27.080991 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.080961 2580 server.go:1295] "Started kubelet" Apr 16 17:40:27.081056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.081030 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 17:40:27.081185 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.081128 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 17:40:27.081224 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.081208 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 17:40:27.081980 ip-10-0-143-216 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 17:40:27.082674 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.082653 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 16 17:40:27.082963 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.082944 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 17:40:27.084790 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.084761 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 17:40:27.084886 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.084869 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-216.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 17:40:27.084984 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.084960 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-216.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 17:40:27.088591 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.088562 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 17:40:27.089237 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.089219 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 17:40:27.089610 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.089585 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-stcbl" Apr 16 17:40:27.090793 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.090771 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found" Apr 16 17:40:27.090919 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.090877 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 17:40:27.091132 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.091100 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 17:40:27.091239 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.091136 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 17:40:27.092226 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.091348 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 16 17:40:27.092226 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.091362 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 16 17:40:27.092226 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.090492 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-216.ec2.internal.18a6e71fdcef3852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-216.ec2.internal,UID:ip-10-0-143-216.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-216.ec2.internal,},FirstTimestamp:2026-04-16 17:40:27.080906834 +0000 UTC m=+0.455081253,LastTimestamp:2026-04-16 17:40:27.080906834 +0000 UTC m=+0.455081253,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-216.ec2.internal,}" Apr 16 17:40:27.093101 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.092801 2580 factory.go:55] Registering systemd factory Apr 16 17:40:27.093101 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.092820 2580 factory.go:223] Registration of the systemd container factory successfully Apr 16 17:40:27.093405 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.093381 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 17:40:27.093638 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.093604 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-216.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 17:40:27.095315 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.095297 2580 factory.go:153] Registering CRI-O factory Apr 16 17:40:27.095315 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.095316 2580 factory.go:223] Registration of the crio container factory successfully Apr 16 17:40:27.095442 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.095377 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 17:40:27.095442 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.095401 2580 factory.go:103] Registering Raw factory Apr 16 17:40:27.095442 ip-10-0-143-216 kubenswrapper[2580]: 
I0416 17:40:27.095417 2580 manager.go:1196] Started watching for new ooms in manager Apr 16 17:40:27.095778 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.095751 2580 manager.go:319] Starting recovery of all containers Apr 16 17:40:27.100566 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.100523 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 17:40:27.103232 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.102952 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-stcbl" Apr 16 17:40:27.107868 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.107851 2580 manager.go:324] Recovery completed Apr 16 17:40:27.112090 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.112077 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:27.115634 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.115613 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:27.115734 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.115646 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:27.115734 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.115661 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:27.116229 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.116215 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 17:40:27.116229 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.116226 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 17:40:27.116315 ip-10-0-143-216 kubenswrapper[2580]: 
I0416 17:40:27.116243 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 16 17:40:27.118182 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.118093 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-216.ec2.internal.18a6e71fdf010de5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-216.ec2.internal,UID:ip-10-0-143-216.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-216.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-216.ec2.internal,},FirstTimestamp:2026-04-16 17:40:27.115630053 +0000 UTC m=+0.489804475,LastTimestamp:2026-04-16 17:40:27.115630053 +0000 UTC m=+0.489804475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-216.ec2.internal,}" Apr 16 17:40:27.118371 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.118358 2580 policy_none.go:49] "None policy: Start" Apr 16 17:40:27.118424 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.118374 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 17:40:27.118424 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.118396 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.160771 2580 manager.go:341] "Starting Device Plugin manager" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.160821 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.160831 
2580 server.go:85] "Starting device plugin registration server" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.161115 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.161130 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.161273 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.161350 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.161358 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.161861 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 17:40:27.168340 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.161893 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-216.ec2.internal\" not found" Apr 16 17:40:27.256676 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.256594 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 17:40:27.257958 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.257942 2580 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 17:40:27.258018 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.257970 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 17:40:27.258018 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.257989 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 17:40:27.258018 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.257997 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 17:40:27.258125 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.258034 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 17:40:27.260592 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.260569 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:27.261293 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.261274 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:27.262471 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.262453 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:27.262567 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.262484 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:27.262567 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.262498 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:27.262567 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.262525 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.270712 ip-10-0-143-216 kubenswrapper[2580]: I0416 
17:40:27.270697 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.270813 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.270718 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-216.ec2.internal\": node \"ip-10-0-143-216.ec2.internal\" not found" Apr 16 17:40:27.289574 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.289548 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found" Apr 16 17:40:27.358472 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.358411 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal"] Apr 16 17:40:27.358584 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.358524 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:27.359613 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.359592 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:27.359719 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.359622 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:27.359719 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.359631 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:27.362023 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362012 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:27.362168 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362144 2580 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.362206 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362188 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:27.362752 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362738 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:27.362796 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362761 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:27.362796 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362770 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:27.362863 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362799 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:27.362863 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362818 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:27.362863 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.362833 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:27.365432 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.365417 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.365474 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.365445 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:27.366131 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.366115 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:27.366210 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.366144 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:27.366210 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.366168 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:27.390468 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.390442 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found" Apr 16 17:40:27.393864 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.393801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cbf533753c31efbf1e7c032d253d1486-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal\" (UID: \"cbf533753c31efbf1e7c032d253d1486\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.393973 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.393870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbf533753c31efbf1e7c032d253d1486-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal\" (UID: 
\"cbf533753c31efbf1e7c032d253d1486\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.393973 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.393899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d0025abf26159e90c9c59e296cbe6be-config\") pod \"kube-apiserver-proxy-ip-10-0-143-216.ec2.internal\" (UID: \"3d0025abf26159e90c9c59e296cbe6be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.393973 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.393929 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-216.ec2.internal\" not found" node="ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.398454 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.398435 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-216.ec2.internal\" not found" node="ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.490850 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.490811 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found" Apr 16 17:40:27.494061 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.494043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d0025abf26159e90c9c59e296cbe6be-config\") pod \"kube-apiserver-proxy-ip-10-0-143-216.ec2.internal\" (UID: \"3d0025abf26159e90c9c59e296cbe6be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal" Apr 16 17:40:27.494175 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.494075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/cbf533753c31efbf1e7c032d253d1486-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal\" (UID: \"cbf533753c31efbf1e7c032d253d1486\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:27.494175 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.494106 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbf533753c31efbf1e7c032d253d1486-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal\" (UID: \"cbf533753c31efbf1e7c032d253d1486\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:27.494175 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.494143 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3d0025abf26159e90c9c59e296cbe6be-config\") pod \"kube-apiserver-proxy-ip-10-0-143-216.ec2.internal\" (UID: \"3d0025abf26159e90c9c59e296cbe6be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:27.494303 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.494177 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cbf533753c31efbf1e7c032d253d1486-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal\" (UID: \"cbf533753c31efbf1e7c032d253d1486\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:27.494303 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.494186 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbf533753c31efbf1e7c032d253d1486-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal\" (UID: \"cbf533753c31efbf1e7c032d253d1486\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:27.591592 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.591497 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:27.692066 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.692040 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:27.697361 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.697345 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:27.700905 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.700887 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:27.792983 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.792957 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:27.893611 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.893532 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:27.994213 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:27.994148 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:27.994213 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.994191 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 17:40:27.994892 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:27.994356 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:40:28.063385 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.063188 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:28.088989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.088959 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 17:40:28.094941 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:28.094919 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:28.105583 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.105541 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:35:27 +0000 UTC" deadline="2028-01-24 00:07:38.50591434 +0000 UTC"
Apr 16 17:40:28.105583 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.105580 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15534h27m10.400336578s"
Apr 16 17:40:28.105715 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.105647 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:40:28.127199 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.127152 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zkdqt"
Apr 16 17:40:28.136196 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.136173 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zkdqt"
Apr 16 17:40:28.195628 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:28.195540 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:28.296104 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:28.296066 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:28.332276 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:28.332237 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbf533753c31efbf1e7c032d253d1486.slice/crio-ca15150433b11878c64f1a1e5761ec7cb2e330d84b000f51927280da243801a6 WatchSource:0}: Error finding container ca15150433b11878c64f1a1e5761ec7cb2e330d84b000f51927280da243801a6: Status 404 returned error can't find the container with id ca15150433b11878c64f1a1e5761ec7cb2e330d84b000f51927280da243801a6
Apr 16 17:40:28.332538 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:28.332516 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d0025abf26159e90c9c59e296cbe6be.slice/crio-4bcb3e750b2d43a917c2b02b683c32868535ef2fc59114ae6532882643bab93e WatchSource:0}: Error finding container 4bcb3e750b2d43a917c2b02b683c32868535ef2fc59114ae6532882643bab93e: Status 404 returned error can't find the container with id 4bcb3e750b2d43a917c2b02b683c32868535ef2fc59114ae6532882643bab93e
Apr 16 17:40:28.337550 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.337534 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:40:28.396634 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:28.396593 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-216.ec2.internal\" not found"
Apr 16 17:40:28.433228 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.433200 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:28.449087 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.449020 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:28.489881 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.489852 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:28.499121 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.499092 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 17:40:28.500094 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.500082 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal"
Apr 16 17:40:28.517137 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:28.517119 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 17:40:29.069102 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.068852 2580 apiserver.go:52] "Watching apiserver"
Apr 16 17:40:29.074702 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.074669 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 17:40:29.076948 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.076864 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-78497","kube-system/konnectivity-agent-qkj9z","kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q","openshift-cluster-node-tuning-operator/tuned-zdhfq","openshift-multus/multus-additional-cni-plugins-blkts","openshift-multus/multus-w72d2","openshift-network-diagnostics/network-check-target-hnxw4","openshift-ovn-kubernetes/ovnkube-node-5r9pl","openshift-dns/node-resolver-qnz92","openshift-image-registry/node-ca-mxk78","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal","openshift-multus/network-metrics-daemon-2st9k"]
Apr 16 17:40:29.083314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.082512 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qkj9z"
Apr 16 17:40:29.085054 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.085029 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.085194 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.085088 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-h9tm8\""
Apr 16 17:40:29.086237 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.085693 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 17:40:29.086237 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.085836 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 17:40:29.086788 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.086766 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 17:40:29.087290 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.087264 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mnbq7\""
Apr 16 17:40:29.087542 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.087524 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 17:40:29.087751 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.087734 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 17:40:29.087830 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.087757 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.089459 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.089437 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5v4bz\""
Apr 16 17:40:29.089710 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.089693 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:40:29.089857 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.089843 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 17:40:29.090389 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.090098 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.090389 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.090249 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.092111 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.092090 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 17:40:29.092308 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.092291 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xlkxb\""
Apr 16 17:40:29.092387 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.092338 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 17:40:29.092497 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.092480 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hgp2z\""
Apr 16 17:40:29.092979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.092502 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 17:40:29.092979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.092642 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 17:40:29.092979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.092786 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 17:40:29.092979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.092793 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 17:40:29.095136 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.095047 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-78497"
Apr 16 17:40:29.102535 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.102512 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:40:29.102765 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.102514 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7srdm\""
Apr 16 17:40:29.103048 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103030 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-etc-selinux\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.103139 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103061 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-cni-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.103139 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103080 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-cni-multus\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.103139 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103101 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg8jp\" (UniqueName: \"kubernetes.io/projected/21cfe301-09d7-4af8-8050-a3969d8eb2db-kube-api-access-tg8jp\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.103139 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103130 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/872f1cbb-5c2b-4f0b-a6f4-1d4693e07813-agent-certs\") pod \"konnectivity-agent-qkj9z\" (UID: \"872f1cbb-5c2b-4f0b-a6f4-1d4693e07813\") " pod="kube-system/konnectivity-agent-qkj9z"
Apr 16 17:40:29.103357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103148 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/872f1cbb-5c2b-4f0b-a6f4-1d4693e07813-konnectivity-ca\") pod \"konnectivity-agent-qkj9z\" (UID: \"872f1cbb-5c2b-4f0b-a6f4-1d4693e07813\") " pod="kube-system/konnectivity-agent-qkj9z"
Apr 16 17:40:29.103357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysctl-d\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.103357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103216 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-run\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.103357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103233 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-var-lib-kubelet\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.103357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103259 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-os-release\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.103357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103292 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c2461943-ed32-4619-8936-d5df421841b4-iptables-alerter-script\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497"
Apr 16 17:40:29.103357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103341 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2461943-ed32-4619-8936-d5df421841b4-host-slash\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497"
Apr 16 17:40:29.103677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103428 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhpw\" (UniqueName: \"kubernetes.io/projected/d75665a5-d922-4bab-a0cb-028ff1c31c7a-kube-api-access-gfhpw\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.103677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103481 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysconfig\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.103677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103531 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-cnibin\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.103677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103555 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:29.103677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103591 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.103677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-os-release\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.103677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103646 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 17:40:29.103677 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.103648 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103687 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-cni-bin\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103723 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-kubelet\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103752 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-multus-certs\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103784 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-tmp\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103877 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-registration-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103907 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-systemd\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103939 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-tuned\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103971 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.103977 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-system-cni-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104039 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104018 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-etc-kubernetes\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104056 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-socket-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104095 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-device-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104127 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-modprobe-d\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-system-cni-dir\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104269 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnfp9\" (UniqueName: \"kubernetes.io/projected/c2461943-ed32-4619-8936-d5df421841b4-kube-api-access-xnfp9\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-socket-dir-parent\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104336 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-netns\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104367 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-conf-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-sys\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104428 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-cni-binary-copy\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104481 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-cnibin\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104593 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21cfe301-09d7-4af8-8050-a3969d8eb2db-cni-binary-copy\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104613 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-daemon-config\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104631 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-host\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104650 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104667 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-k8s-cni-cncf-io\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104685 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd4kh\" (UniqueName: \"kubernetes.io/projected/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-kube-api-access-bd4kh\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104706 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-sys-fs\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104724 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-kubernetes\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104743 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysctl-conf\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-lib-modules\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104789 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk6rn\" (UniqueName: \"kubernetes.io/projected/434e8414-887d-4565-9d3b-620183c5537a-kube-api-access-vk6rn\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104830 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-hostroot\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.104970 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.104859 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.106530 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.106511 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qnz92"
Apr 16 17:40:29.109215 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.109197 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 17:40:29.109445 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.109424 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jcfh8\""
Apr 16 17:40:29.109525 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.109432 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 17:40:29.111602 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.111581 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-mxk78" Apr 16 17:40:29.113327 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.113308 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 17:40:29.113756 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.113738 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q85p4\"" Apr 16 17:40:29.113968 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.113951 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 17:40:29.114217 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.114196 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 17:40:29.114295 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.114270 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:29.114346 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.114327 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7" Apr 16 17:40:29.114455 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.114439 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.116545 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.116523 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 17:40:29.116767 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.116574 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 17:40:29.116767 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.116728 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 17:40:29.117341 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.117319 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tknzl\"" Apr 16 17:40:29.117434 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.117382 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 17:40:29.117490 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.117325 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 17:40:29.118330 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.118312 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 17:40:29.137956 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.137925 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:35:28 +0000 UTC" deadline="2027-09-17 17:13:44.444967502 +0000 UTC" Apr 16 17:40:29.138096 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.138082 2580 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12455h33m15.3068923s" Apr 16 17:40:29.192109 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.192078 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 17:40:29.205078 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-k8s-cni-cncf-io\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.205252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205093 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-env-overrides\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.205252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205121 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-hosts-file\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92" Apr 16 17:40:29.205252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205149 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bd4kh\" (UniqueName: \"kubernetes.io/projected/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-kube-api-access-bd4kh\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.205252 
ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysctl-conf\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.205252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205218 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-lib-modules\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.205252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205249 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/872f1cbb-5c2b-4f0b-a6f4-1d4693e07813-agent-certs\") pod \"konnectivity-agent-qkj9z\" (UID: \"872f1cbb-5c2b-4f0b-a6f4-1d4693e07813\") " pod="kube-system/konnectivity-agent-qkj9z" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205268 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/872f1cbb-5c2b-4f0b-a6f4-1d4693e07813-konnectivity-ca\") pod \"konnectivity-agent-qkj9z\" (UID: \"872f1cbb-5c2b-4f0b-a6f4-1d4693e07813\") " pod="kube-system/konnectivity-agent-qkj9z" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205284 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-cni-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.205476 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:40:29.205307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg8jp\" (UniqueName: \"kubernetes.io/projected/21cfe301-09d7-4af8-8050-a3969d8eb2db-kube-api-access-tg8jp\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205328 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tkw5\" (UniqueName: \"kubernetes.io/projected/75516b17-54a7-403c-b9a7-20ae8a32ebb7-kube-api-access-2tkw5\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205352 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovnkube-config\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205370 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-run\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205385 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-os-release\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " 
pod="openshift-multus/multus-additional-cni-plugins-blkts" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205399 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c2461943-ed32-4619-8936-d5df421841b4-iptables-alerter-script\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2461943-ed32-4619-8936-d5df421841b4-host-slash\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205439 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-os-release\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.205476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205460 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-cni-bin\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205481 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysconfig\") pod \"tuned-zdhfq\" (UID: 
\"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205505 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-cnibin\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205526 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205541 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-kubelet\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205557 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-multus-certs\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205574 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-kubelet\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205589 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-slash\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205604 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-tmp\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-tuned\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-system-cni-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205670 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwk69\" (UniqueName: 
\"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxxqg\" (UniqueName: \"kubernetes.io/projected/4a975d1a-4be7-41a5-b7fd-95561bba816e-kube-api-access-vxxqg\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205719 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfhp\" (UniqueName: \"kubernetes.io/projected/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-kube-api-access-9dfhp\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205737 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-device-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205753 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-modprobe-d\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205769 
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-system-cni-dir\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts" Apr 16 17:40:29.205795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-conf-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a975d1a-4be7-41a5-b7fd-95561bba816e-serviceca\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205832 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-systemd-units\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205851 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-sys\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: 
I0416 17:40:29.205873 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205932 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snxpl\" (UniqueName: \"kubernetes.io/projected/a343c3d6-4d4e-4e8e-8658-c59308d09601-kube-api-access-snxpl\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205950 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-host\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205965 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blkts\" (UID: 
\"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.205982 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-cni-bin\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206002 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovn-node-metrics-cert\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206025 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-sys-fs\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206042 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-kubernetes\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206077 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk6rn\" (UniqueName: 
\"kubernetes.io/projected/434e8414-887d-4565-9d3b-620183c5537a-kube-api-access-vk6rn\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206101 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-hostroot\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-systemd\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206131 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-log-socket\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.206370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206147 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovnkube-script-lib\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206177 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206199 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-etc-selinux\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206223 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-cni-multus\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206348 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206475 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-os-release\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206512 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-etc-selinux\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206529 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-run\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206561 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206565 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-cni-multus\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206595 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-k8s-cni-cncf-io\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206606 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-cni-bin\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206627 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2461943-ed32-4619-8936-d5df421841b4-host-slash\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysconfig\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206664 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-device-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206514 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-modprobe-d\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206689 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-cnibin\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206795 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 17:40:29.207059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206860 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-multus-certs\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206902 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-var-lib-kubelet\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206954 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-system-cni-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.206988 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-system-cni-dir\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207023 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-conf-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-sys\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207389 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-host\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207388 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207390 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysctl-conf\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207489 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-lib-modules\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207544 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-cni-dir\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207588 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-sys-fs\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207625 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-node-log\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysctl-d\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207687 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-var-lib-kubelet\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207721 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a975d1a-4be7-41a5-b7fd-95561bba816e-host\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207781 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-ovn\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.207983 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207809 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-run-ovn-kubernetes\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207838 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhpw\" (UniqueName: \"kubernetes.io/projected/d75665a5-d922-4bab-a0cb-028ff1c31c7a-kube-api-access-gfhpw\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-etc-kubernetes\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207870 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207884 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-hostroot\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207932 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-kubernetes\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207935 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-var-lib-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208126 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-registration-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208179 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-systemd\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-cni-netd\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-socket-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208271 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnfp9\" (UniqueName: \"kubernetes.io/projected/c2461943-ed32-4619-8936-d5df421841b4-kube-api-access-xnfp9\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208301 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-socket-dir-parent\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/872f1cbb-5c2b-4f0b-a6f4-1d4693e07813-konnectivity-ca\") pod \"konnectivity-agent-qkj9z\" (UID: \"872f1cbb-5c2b-4f0b-a6f4-1d4693e07813\") " pod="kube-system/konnectivity-agent-qkj9z"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208335 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-netns\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-daemon-config\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208388 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-run-netns\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.208744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208419 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-cni-binary-copy\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208451 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208478 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-cnibin\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208502 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21cfe301-09d7-4af8-8050-a3969d8eb2db-cni-binary-copy\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-etc-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-tmp-dir\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.207958 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/434e8414-887d-4565-9d3b-620183c5537a-os-release\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208708 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-sysctl-d\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208759 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-var-lib-kubelet\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208859 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c2461943-ed32-4619-8936-d5df421841b4-iptables-alerter-script\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208870 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-etc-kubernetes\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208945 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-host-run-netns\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208955 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-socket-dir-parent\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208964 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21cfe301-09d7-4af8-8050-a3969d8eb2db-cnibin\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.208992 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-registration-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.209040 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-systemd\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.209250 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d75665a5-d922-4bab-a0cb-028ff1c31c7a-socket-dir\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.210179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.209482 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-cni-binary-copy\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.210838 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.209599 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/434e8414-887d-4565-9d3b-620183c5537a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.210838 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.209668 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21cfe301-09d7-4af8-8050-a3969d8eb2db-multus-daemon-config\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.210838 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.209701 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21cfe301-09d7-4af8-8050-a3969d8eb2db-cni-binary-copy\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.210838 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.210589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-tmp\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.210838 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.210625 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-etc-tuned\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.210838 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.210734 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/872f1cbb-5c2b-4f0b-a6f4-1d4693e07813-agent-certs\") pod \"konnectivity-agent-qkj9z\" (UID: \"872f1cbb-5c2b-4f0b-a6f4-1d4693e07813\") " pod="kube-system/konnectivity-agent-qkj9z"
Apr 16 17:40:29.219241 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.218778 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd4kh\" (UniqueName: \"kubernetes.io/projected/feac9d41-8b69-46c4-bae0-8f7642ab0fd9-kube-api-access-bd4kh\") pod \"tuned-zdhfq\" (UID: \"feac9d41-8b69-46c4-bae0-8f7642ab0fd9\") " pod="openshift-cluster-node-tuning-operator/tuned-zdhfq"
Apr 16 17:40:29.224385 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.222700 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg8jp\" (UniqueName: \"kubernetes.io/projected/21cfe301-09d7-4af8-8050-a3969d8eb2db-kube-api-access-tg8jp\") pod \"multus-w72d2\" (UID: \"21cfe301-09d7-4af8-8050-a3969d8eb2db\") " pod="openshift-multus/multus-w72d2"
Apr 16 17:40:29.224385 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.222893 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhpw\" (UniqueName: \"kubernetes.io/projected/d75665a5-d922-4bab-a0cb-028ff1c31c7a-kube-api-access-gfhpw\") pod \"aws-ebs-csi-driver-node-gg87q\" (UID: \"d75665a5-d922-4bab-a0cb-028ff1c31c7a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q"
Apr 16 17:40:29.224385 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.223064 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk6rn\" (UniqueName: \"kubernetes.io/projected/434e8414-887d-4565-9d3b-620183c5537a-kube-api-access-vk6rn\") pod \"multus-additional-cni-plugins-blkts\" (UID: \"434e8414-887d-4565-9d3b-620183c5537a\") " pod="openshift-multus/multus-additional-cni-plugins-blkts"
Apr 16 17:40:29.224385 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.223071 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnfp9\" (UniqueName: \"kubernetes.io/projected/c2461943-ed32-4619-8936-d5df421841b4-kube-api-access-xnfp9\") pod \"iptables-alerter-78497\" (UID: \"c2461943-ed32-4619-8936-d5df421841b4\") " pod="openshift-network-operator/iptables-alerter-78497"
Apr 16 17:40:29.261995 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.261937 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal" event={"ID":"cbf533753c31efbf1e7c032d253d1486","Type":"ContainerStarted","Data":"ca15150433b11878c64f1a1e5761ec7cb2e330d84b000f51927280da243801a6"}
Apr 16 17:40:29.262666 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.262640 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal" event={"ID":"3d0025abf26159e90c9c59e296cbe6be","Type":"ContainerStarted","Data":"4bcb3e750b2d43a917c2b02b683c32868535ef2fc59114ae6532882643bab93e"}
Apr 16 17:40:29.309116 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309076 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309116 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309120 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-node-log\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a975d1a-4be7-41a5-b7fd-95561bba816e-host\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a975d1a-4be7-41a5-b7fd-95561bba816e-host\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309222 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-node-log\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309237 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-ovn\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309265 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-run-ovn-kubernetes\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309278 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309291 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-var-lib-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309280 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-ovn\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309300 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-run-ovn-kubernetes\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309315 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-cni-netd\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309349 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-cni-netd\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309350 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-var-lib-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309377 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309386 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-run-netns\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309416 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-etc-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309422 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-run-netns\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309441 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-tmp-dir\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92"
Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309467 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-env-overrides\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309470 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-etc-openvswitch\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl"
Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309481 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-hosts-file\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92"
Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tkw5\" (UniqueName: \"kubernetes.io/projected/75516b17-54a7-403c-b9a7-20ae8a32ebb7-kube-api-access-2tkw5\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309537 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovnkube-config\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309574 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-kubelet\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-slash\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309622 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-hosts-file\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309628 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309651 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vxxqg\" (UniqueName: \"kubernetes.io/projected/4a975d1a-4be7-41a5-b7fd-95561bba816e-kube-api-access-vxxqg\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfhp\" (UniqueName: \"kubernetes.io/projected/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-kube-api-access-9dfhp\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a975d1a-4be7-41a5-b7fd-95561bba816e-serviceca\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309705 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-systemd-units\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.309895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309721 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309739 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309763 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snxpl\" (UniqueName: \"kubernetes.io/projected/a343c3d6-4d4e-4e8e-8658-c59308d09601-kube-api-access-snxpl\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309781 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-cni-bin\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovn-node-metrics-cert\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309813 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-systemd\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 
ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309837 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-log-socket\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309853 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovnkube-script-lib\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-env-overrides\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.310109 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310135 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovnkube-config\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.309812 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-tmp-dir\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.310212 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs podName:75516b17-54a7-403c-b9a7-20ae8a32ebb7 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:29.810189541 +0000 UTC m=+3.184363970 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs") pod "network-metrics-daemon-2st9k" (UID: "75516b17-54a7-403c-b9a7-20ae8a32ebb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310233 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-run-systemd\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310252 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-kubelet\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310281 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-log-socket\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310291 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-slash\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.310708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310308 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovnkube-script-lib\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.311410 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310358 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-cni-bin\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.311410 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310479 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.311410 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310505 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a343c3d6-4d4e-4e8e-8658-c59308d09601-systemd-units\") pod \"ovnkube-node-5r9pl\" (UID: 
\"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.311410 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.310758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4a975d1a-4be7-41a5-b7fd-95561bba816e-serviceca\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78" Apr 16 17:40:29.312500 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.312476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a343c3d6-4d4e-4e8e-8658-c59308d09601-ovn-node-metrics-cert\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.319547 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.319487 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:29.319547 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.319511 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:29.319547 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.319526 2580 projected.go:194] Error preparing data for projected volume kube-api-access-hwk69 for pod openshift-network-diagnostics/network-check-target-hnxw4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:29.319741 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.319589 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69 podName:b60522d0-bdd5-4710-961d-66c6a6112265 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:29.819571715 +0000 UTC m=+3.193746121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hwk69" (UniqueName: "kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69") pod "network-check-target-hnxw4" (UID: "b60522d0-bdd5-4710-961d-66c6a6112265") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:29.320717 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.320696 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tkw5\" (UniqueName: \"kubernetes.io/projected/75516b17-54a7-403c-b9a7-20ae8a32ebb7-kube-api-access-2tkw5\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:29.321723 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.321704 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snxpl\" (UniqueName: \"kubernetes.io/projected/a343c3d6-4d4e-4e8e-8658-c59308d09601-kube-api-access-snxpl\") pod \"ovnkube-node-5r9pl\" (UID: \"a343c3d6-4d4e-4e8e-8658-c59308d09601\") " pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.322555 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.322533 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfhp\" (UniqueName: \"kubernetes.io/projected/d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f-kube-api-access-9dfhp\") pod \"node-resolver-qnz92\" (UID: \"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f\") " pod="openshift-dns/node-resolver-qnz92" Apr 16 17:40:29.322777 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.322758 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxxqg\" (UniqueName: \"kubernetes.io/projected/4a975d1a-4be7-41a5-b7fd-95561bba816e-kube-api-access-vxxqg\") pod \"node-ca-mxk78\" (UID: \"4a975d1a-4be7-41a5-b7fd-95561bba816e\") " pod="openshift-image-registry/node-ca-mxk78" Apr 16 17:40:29.364486 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.364452 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:29.395300 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.395268 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qkj9z" Apr 16 17:40:29.413227 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.413192 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" Apr 16 17:40:29.421290 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.421258 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q" Apr 16 17:40:29.428901 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.428881 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-blkts" Apr 16 17:40:29.435670 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.435638 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w72d2" Apr 16 17:40:29.444335 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.444299 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-78497" Apr 16 17:40:29.451027 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.451000 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qnz92" Apr 16 17:40:29.457982 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.457720 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mxk78" Apr 16 17:40:29.462450 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.462429 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:29.811780 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.811691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:29.811935 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.811798 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:29.811935 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.811851 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs podName:75516b17-54a7-403c-b9a7-20ae8a32ebb7 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:30.811838028 +0000 UTC m=+4.186012435 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs") pod "network-metrics-daemon-2st9k" (UID: "75516b17-54a7-403c-b9a7-20ae8a32ebb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:29.912438 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:29.912402 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:29.912607 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.912576 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:29.912607 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.912597 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:29.912607 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.912607 2580 projected.go:194] Error preparing data for projected volume kube-api-access-hwk69 for pod openshift-network-diagnostics/network-check-target-hnxw4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:29.912747 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:29.912667 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69 podName:b60522d0-bdd5-4710-961d-66c6a6112265 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:30.912646605 +0000 UTC m=+4.286821026 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hwk69" (UniqueName: "kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69") pod "network-check-target-hnxw4" (UID: "b60522d0-bdd5-4710-961d-66c6a6112265") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:30.020706 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.020582 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a975d1a_4be7_41a5_b7fd_95561bba816e.slice/crio-4c7b10e017683f9bbe9fe9759a6ffba9ad85203f9bf4ec46f2e0ea689698fed9 WatchSource:0}: Error finding container 4c7b10e017683f9bbe9fe9759a6ffba9ad85203f9bf4ec46f2e0ea689698fed9: Status 404 returned error can't find the container with id 4c7b10e017683f9bbe9fe9759a6ffba9ad85203f9bf4ec46f2e0ea689698fed9 Apr 16 17:40:30.021972 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.021935 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod434e8414_887d_4565_9d3b_620183c5537a.slice/crio-9fd749ea91c8704f512a2edac88f1f2499362ce56e270b658590f92ab0b83e80 WatchSource:0}: Error finding container 9fd749ea91c8704f512a2edac88f1f2499362ce56e270b658590f92ab0b83e80: Status 404 returned error can't find the container with id 9fd749ea91c8704f512a2edac88f1f2499362ce56e270b658590f92ab0b83e80 Apr 16 17:40:30.023955 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.023842 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeac9d41_8b69_46c4_bae0_8f7642ab0fd9.slice/crio-fa9b0c107277b667c9a6e89f5d2c2330b42860776715d4b7a84549e2f4fa01d4 WatchSource:0}: Error finding container 
fa9b0c107277b667c9a6e89f5d2c2330b42860776715d4b7a84549e2f4fa01d4: Status 404 returned error can't find the container with id fa9b0c107277b667c9a6e89f5d2c2330b42860776715d4b7a84549e2f4fa01d4 Apr 16 17:40:30.025759 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.025741 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7bc1548_2cb4_4ade_bff6_dbdeadb9d76f.slice/crio-bc7619a7609c82404d294a9aebd808c74609b5190b371be73dae7e643c491f39 WatchSource:0}: Error finding container bc7619a7609c82404d294a9aebd808c74609b5190b371be73dae7e643c491f39: Status 404 returned error can't find the container with id bc7619a7609c82404d294a9aebd808c74609b5190b371be73dae7e643c491f39 Apr 16 17:40:30.026770 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.026667 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75665a5_d922_4bab_a0cb_028ff1c31c7a.slice/crio-044bf25e9861f4a9ddce2e01c7dd374c015fd3b2536ec371e3936f3924c59c83 WatchSource:0}: Error finding container 044bf25e9861f4a9ddce2e01c7dd374c015fd3b2536ec371e3936f3924c59c83: Status 404 returned error can't find the container with id 044bf25e9861f4a9ddce2e01c7dd374c015fd3b2536ec371e3936f3924c59c83 Apr 16 17:40:30.047882 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.047756 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872f1cbb_5c2b_4f0b_a6f4_1d4693e07813.slice/crio-6c52081fee5cfbe6b72991fe1908a45b3e884b88bfbe22f36a81447f1aea49b3 WatchSource:0}: Error finding container 6c52081fee5cfbe6b72991fe1908a45b3e884b88bfbe22f36a81447f1aea49b3: Status 404 returned error can't find the container with id 6c52081fee5cfbe6b72991fe1908a45b3e884b88bfbe22f36a81447f1aea49b3 Apr 16 17:40:30.048432 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.048397 2580 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21cfe301_09d7_4af8_8050_a3969d8eb2db.slice/crio-ed16ab9bae43dacc8751a1e34c8aa7298ff25563a661295a5878b202dc9a6a68 WatchSource:0}: Error finding container ed16ab9bae43dacc8751a1e34c8aa7298ff25563a661295a5878b202dc9a6a68: Status 404 returned error can't find the container with id ed16ab9bae43dacc8751a1e34c8aa7298ff25563a661295a5878b202dc9a6a68
Apr 16 17:40:30.049292 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.049270 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda343c3d6_4d4e_4e8e_8658_c59308d09601.slice/crio-eb4d8255ec417f71c85f3690849af07df2a1d2abb9ad1bc6c265db48f6bd1a67 WatchSource:0}: Error finding container eb4d8255ec417f71c85f3690849af07df2a1d2abb9ad1bc6c265db48f6bd1a67: Status 404 returned error can't find the container with id eb4d8255ec417f71c85f3690849af07df2a1d2abb9ad1bc6c265db48f6bd1a67
Apr 16 17:40:30.050233 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:40:30.050205 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2461943_ed32_4619_8936_d5df421841b4.slice/crio-17e84fb3c9bf4d2cb6b0ccf3677f6b7d384ade3af5fd3461017b01419bbeaae7 WatchSource:0}: Error finding container 17e84fb3c9bf4d2cb6b0ccf3677f6b7d384ade3af5fd3461017b01419bbeaae7: Status 404 returned error can't find the container with id 17e84fb3c9bf4d2cb6b0ccf3677f6b7d384ade3af5fd3461017b01419bbeaae7
Apr 16 17:40:30.139089 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.138884 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:35:28 +0000 UTC" deadline="2027-12-18 15:09:50.40638101 +0000 UTC"
Apr 16 17:40:30.139089 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.139087 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14661h29m20.267299125s"
Apr 16 17:40:30.258324 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.258283 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:30.258515 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:30.258429 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:30.264962 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.264931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blkts" event={"ID":"434e8414-887d-4565-9d3b-620183c5537a","Type":"ContainerStarted","Data":"9fd749ea91c8704f512a2edac88f1f2499362ce56e270b658590f92ab0b83e80"}
Apr 16 17:40:30.266130 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.266096 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-78497" event={"ID":"c2461943-ed32-4619-8936-d5df421841b4","Type":"ContainerStarted","Data":"17e84fb3c9bf4d2cb6b0ccf3677f6b7d384ade3af5fd3461017b01419bbeaae7"}
Apr 16 17:40:30.267344 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.267298 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"eb4d8255ec417f71c85f3690849af07df2a1d2abb9ad1bc6c265db48f6bd1a67"}
Apr 16 17:40:30.268418 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.268386 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w72d2" event={"ID":"21cfe301-09d7-4af8-8050-a3969d8eb2db","Type":"ContainerStarted","Data":"ed16ab9bae43dacc8751a1e34c8aa7298ff25563a661295a5878b202dc9a6a68"}
Apr 16 17:40:30.269330 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.269307 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q" event={"ID":"d75665a5-d922-4bab-a0cb-028ff1c31c7a","Type":"ContainerStarted","Data":"044bf25e9861f4a9ddce2e01c7dd374c015fd3b2536ec371e3936f3924c59c83"}
Apr 16 17:40:30.270321 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.270285 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnz92" event={"ID":"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f","Type":"ContainerStarted","Data":"bc7619a7609c82404d294a9aebd808c74609b5190b371be73dae7e643c491f39"}
Apr 16 17:40:30.271200 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.271180 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mxk78" event={"ID":"4a975d1a-4be7-41a5-b7fd-95561bba816e","Type":"ContainerStarted","Data":"4c7b10e017683f9bbe9fe9759a6ffba9ad85203f9bf4ec46f2e0ea689698fed9"}
Apr 16 17:40:30.272877 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.272855 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal" event={"ID":"3d0025abf26159e90c9c59e296cbe6be","Type":"ContainerStarted","Data":"b610e7f1e831160bdfc7dbefb40667e98db0726c3a9c81e90e591bc44014ee17"}
Apr 16 17:40:30.273851 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.273827 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qkj9z" event={"ID":"872f1cbb-5c2b-4f0b-a6f4-1d4693e07813","Type":"ContainerStarted","Data":"6c52081fee5cfbe6b72991fe1908a45b3e884b88bfbe22f36a81447f1aea49b3"}
Apr 16 17:40:30.274917 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.274885 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" event={"ID":"feac9d41-8b69-46c4-bae0-8f7642ab0fd9","Type":"ContainerStarted","Data":"fa9b0c107277b667c9a6e89f5d2c2330b42860776715d4b7a84549e2f4fa01d4"}
Apr 16 17:40:30.289173 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.289097 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-216.ec2.internal" podStartSLOduration=2.289078333 podStartE2EDuration="2.289078333s" podCreationTimestamp="2026-04-16 17:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:30.288713314 +0000 UTC m=+3.662887743" watchObservedRunningTime="2026-04-16 17:40:30.289078333 +0000 UTC m=+3.663252762"
Apr 16 17:40:30.818870 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.818825 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:30.819053 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:30.819002 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:30.819133 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:30.819065 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs podName:75516b17-54a7-403c-b9a7-20ae8a32ebb7 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:32.819045896 +0000 UTC m=+6.193220308 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs") pod "network-metrics-daemon-2st9k" (UID: "75516b17-54a7-403c-b9a7-20ae8a32ebb7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:30.924477 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:30.924437 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:30.924653 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:30.924611 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:30.924653 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:30.924633 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:30.924653 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:30.924645 2580 projected.go:194] Error preparing data for projected volume kube-api-access-hwk69 for pod openshift-network-diagnostics/network-check-target-hnxw4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:30.924811 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:30.924704 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69 podName:b60522d0-bdd5-4710-961d-66c6a6112265 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:32.924685283 +0000 UTC m=+6.298859706 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hwk69" (UniqueName: "kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69") pod "network-check-target-hnxw4" (UID: "b60522d0-bdd5-4710-961d-66c6a6112265") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:31.261063 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:31.261020 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:31.261538 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:31.261174 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:31.284187 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:31.283090 2580 generic.go:358] "Generic (PLEG): container finished" podID="cbf533753c31efbf1e7c032d253d1486" containerID="370d0278dbb8a6f5143cf9c6f5f9d3cee7bdfa8f858b9bb578b9aef1768c4c79" exitCode=0
Apr 16 17:40:31.284187 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:31.284057 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal" event={"ID":"cbf533753c31efbf1e7c032d253d1486","Type":"ContainerDied","Data":"370d0278dbb8a6f5143cf9c6f5f9d3cee7bdfa8f858b9bb578b9aef1768c4c79"}
Apr 16 17:40:32.259052 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:32.259016 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:32.259284 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:32.259187 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:32.302262 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:32.302206 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal" event={"ID":"cbf533753c31efbf1e7c032d253d1486","Type":"ContainerStarted","Data":"3378349e538b2b5a2f8153126f60f94c1275d2ed8e73299a401a68d38b7ba61d"}
Apr 16 17:40:32.319705 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:32.319458 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-216.ec2.internal" podStartSLOduration=4.319438519 podStartE2EDuration="4.319438519s" podCreationTimestamp="2026-04-16 17:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:32.318468224 +0000 UTC m=+5.692642654" watchObservedRunningTime="2026-04-16 17:40:32.319438519 +0000 UTC m=+5.693612949"
Apr 16 17:40:32.840861 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:32.840273 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:32.840861 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:32.840449 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:32.840861 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:32.840514 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs podName:75516b17-54a7-403c-b9a7-20ae8a32ebb7 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:36.840495039 +0000 UTC m=+10.214669460 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs") pod "network-metrics-daemon-2st9k" (UID: "75516b17-54a7-403c-b9a7-20ae8a32ebb7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:32.942054 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:32.941637 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:32.942054 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:32.941847 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:32.942054 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:32.941866 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:32.942054 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:32.941878 2580 projected.go:194] Error preparing data for projected volume kube-api-access-hwk69 for pod openshift-network-diagnostics/network-check-target-hnxw4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:32.942054 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:32.941939 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69 podName:b60522d0-bdd5-4710-961d-66c6a6112265 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:36.941921442 +0000 UTC m=+10.316095864 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hwk69" (UniqueName: "kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69") pod "network-check-target-hnxw4" (UID: "b60522d0-bdd5-4710-961d-66c6a6112265") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:33.259776 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:33.259691 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:33.259914 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:33.259832 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:34.259801 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:34.259262 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:34.259801 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:34.259416 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:35.258644 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:35.258610 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:35.258833 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:35.258738 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:36.258252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:36.258215 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:36.258734 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:36.258354 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:36.873928 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:36.873883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:36.874112 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:36.874050 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:36.874209 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:36.874125 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs podName:75516b17-54a7-403c-b9a7-20ae8a32ebb7 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:44.874103412 +0000 UTC m=+18.248277823 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs") pod "network-metrics-daemon-2st9k" (UID: "75516b17-54a7-403c-b9a7-20ae8a32ebb7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:36.974908 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:36.974867 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:36.975099 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:36.975046 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:36.975099 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:36.975064 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:36.975099 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:36.975076 2580 projected.go:194] Error preparing data for projected volume kube-api-access-hwk69 for pod openshift-network-diagnostics/network-check-target-hnxw4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:36.975302 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:36.975142 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69 podName:b60522d0-bdd5-4710-961d-66c6a6112265 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:44.975121224 +0000 UTC m=+18.349295700 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hwk69" (UniqueName: "kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69") pod "network-check-target-hnxw4" (UID: "b60522d0-bdd5-4710-961d-66c6a6112265") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:37.260839 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:37.260360 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:37.260839 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:37.260480 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:38.258617 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:38.258572 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:38.258792 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:38.258737 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:39.258715 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:39.258682 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:39.259139 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:39.258798 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:40.259217 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:40.259178 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:40.259672 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:40.259297 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:41.258725 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:41.258597 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:41.258725 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:41.258712 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:42.258240 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:42.258206 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:42.258761 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:42.258334 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:43.259265 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:43.259236 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:43.259684 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:43.259335 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:44.258733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:44.258689 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:44.258920 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:44.258843 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:44.932215 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:44.932176 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:44.932626 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:44.932341 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:44.932626 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:44.932411 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs podName:75516b17-54a7-403c-b9a7-20ae8a32ebb7 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:00.932392386 +0000 UTC m=+34.306566813 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs") pod "network-metrics-daemon-2st9k" (UID: "75516b17-54a7-403c-b9a7-20ae8a32ebb7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:45.033310 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:45.033274 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:45.033471 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:45.033409 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:45.033471 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:45.033423 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:45.033471 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:45.033432 2580 projected.go:194] Error preparing data for projected volume kube-api-access-hwk69 for pod openshift-network-diagnostics/network-check-target-hnxw4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:45.033607 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:45.033486 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69 podName:b60522d0-bdd5-4710-961d-66c6a6112265 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:01.033470935 +0000 UTC m=+34.407645357 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hwk69" (UniqueName: "kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69") pod "network-check-target-hnxw4" (UID: "b60522d0-bdd5-4710-961d-66c6a6112265") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:45.259640 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:45.258759 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:45.259640 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:45.258902 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:46.258856 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:46.258816 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:46.259347 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:46.258962 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:47.261572 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:47.261546 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:40:47.261925 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:47.261676 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:40:48.259326 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.258981 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:40:48.259476 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:48.259420 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:40:48.332861 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.332823 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qkj9z" event={"ID":"872f1cbb-5c2b-4f0b-a6f4-1d4693e07813","Type":"ContainerStarted","Data":"13033ecec60c8116d4e81d0c3e8e706c5b989654953f0e81abdd19576cf669be"}
Apr 16 17:40:48.334237 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.334205 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" event={"ID":"feac9d41-8b69-46c4-bae0-8f7642ab0fd9","Type":"ContainerStarted","Data":"a35a4f919f9c20c5c3f631990c4c32c80196d94b25112d5bd45a9c6f7cefc0ea"}
Apr 16 17:40:48.335441 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.335416 2580 generic.go:358] "Generic (PLEG): container finished" podID="434e8414-887d-4565-9d3b-620183c5537a" containerID="354424cd0aab63a6a04727b5f27ae36412dcf61f381575b5816fd743e513439a" exitCode=0
Apr 16 17:40:48.335581 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.335496 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blkts" event={"ID":"434e8414-887d-4565-9d3b-620183c5537a","Type":"ContainerDied","Data":"354424cd0aab63a6a04727b5f27ae36412dcf61f381575b5816fd743e513439a"}
Apr 16 17:40:48.337886 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.337867 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"cf42c94425b8ca4729cb303fb12503a72887032f481deff426c2e8f94b277279"}
Apr 16 17:40:48.337986 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.337892 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"c8f7012c9f837316525ac269a12addac78ebe21cfa74bbb2b92cd8f5d91baeae"}
Apr 16 17:40:48.337986 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.337901 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"13d1ed8911e5755131d1fa9d1d10e80bb8d6efa7bd5f13cbf80ec238bdd6089a"}
Apr 16 17:40:48.337986 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.337910 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"8ae557eb237263bb435eba13895ff13d0602e3b57b4bf3b9a7d54ae18a015e37"}
Apr 16 17:40:48.337986 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.337918 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"6bac282fd44d86227998194f654673d846beebf3780332c6ec26147388c7fe51"}
Apr 16 17:40:48.341318 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.341285 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w72d2" event={"ID":"21cfe301-09d7-4af8-8050-a3969d8eb2db","Type":"ContainerStarted","Data":"aa8d2afecb4853b0ced7a0547f028033e261d903065df16e82d80add3fbaf44a"}
Apr 16 17:40:48.342509 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.342491 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q" event={"ID":"d75665a5-d922-4bab-a0cb-028ff1c31c7a","Type":"ContainerStarted","Data":"d59cde24815e19035e007abc62f3c2325f2a621f07e3331c783dd4d5d255aa74"}
Apr 16 17:40:48.343615 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.343590 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnz92" event={"ID":"d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f","Type":"ContainerStarted","Data":"63a6b63e2772a34e5d0ab04440b0ad09f81cda900b7f4d5dada590dea85f0e15"}
Apr 16 17:40:48.344680 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.344664 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mxk78" event={"ID":"4a975d1a-4be7-41a5-b7fd-95561bba816e","Type":"ContainerStarted","Data":"2bc93a74c1c6b90aee5b2b664ba2d2d7738cd3d2f14197dbb0833839250f3320"}
Apr 16 17:40:48.352517 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.352476 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qkj9z" podStartSLOduration=4.185724935 podStartE2EDuration="21.352465917s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.049671567 +0000 UTC m=+3.423845972" lastFinishedPulling="2026-04-16 17:40:47.216412535 +0000 UTC m=+20.590586954" observedRunningTime="2026-04-16 17:40:48.35243042 +0000 UTC m=+21.726604848" watchObservedRunningTime="2026-04-16 17:40:48.352465917 +0000 UTC m=+21.726640345"
Apr 16 17:40:48.387911 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.387865 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qnz92" podStartSLOduration=4.217296323 podStartE2EDuration="21.387851322s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.046060945 +0000 UTC m=+3.420235351" lastFinishedPulling="2026-04-16 17:40:47.21661593 +0000 UTC m=+20.590790350" observedRunningTime="2026-04-16 17:40:48.387484506 +0000 UTC m=+21.761658933" watchObservedRunningTime="2026-04-16 17:40:48.387851322 +0000 UTC m=+21.762025750"
Apr 16 17:40:48.401878 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.401838 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mxk78"
podStartSLOduration=8.674123741 podStartE2EDuration="21.401823225s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.022473665 +0000 UTC m=+3.396648085" lastFinishedPulling="2026-04-16 17:40:42.75017316 +0000 UTC m=+16.124347569" observedRunningTime="2026-04-16 17:40:48.401741846 +0000 UTC m=+21.775916274" watchObservedRunningTime="2026-04-16 17:40:48.401823225 +0000 UTC m=+21.775997653" Apr 16 17:40:48.418535 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.418491 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zdhfq" podStartSLOduration=4.224609745 podStartE2EDuration="21.418476916s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.0253991 +0000 UTC m=+3.399573519" lastFinishedPulling="2026-04-16 17:40:47.219266278 +0000 UTC m=+20.593440690" observedRunningTime="2026-04-16 17:40:48.418365105 +0000 UTC m=+21.792539533" watchObservedRunningTime="2026-04-16 17:40:48.418476916 +0000 UTC m=+21.792651352" Apr 16 17:40:48.437045 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.436988 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w72d2" podStartSLOduration=4.201913598 podStartE2EDuration="21.436974637s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.053390528 +0000 UTC m=+3.427564937" lastFinishedPulling="2026-04-16 17:40:47.288451556 +0000 UTC m=+20.662625976" observedRunningTime="2026-04-16 17:40:48.436456285 +0000 UTC m=+21.810630714" watchObservedRunningTime="2026-04-16 17:40:48.436974637 +0000 UTC m=+21.811149064" Apr 16 17:40:48.839200 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:48.839095 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 17:40:49.177547 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:40:49.177446 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T17:40:48.839128701Z","UUID":"95a4ff93-7f4a-4ce2-ba56-4e372e526cd8","Handler":null,"Name":"","Endpoint":""} Apr 16 17:40:49.179018 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:49.178988 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 17:40:49.179018 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:49.179023 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 17:40:49.259319 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:49.259275 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:49.259493 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:49.259415 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265" Apr 16 17:40:49.348958 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:49.348919 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-78497" event={"ID":"c2461943-ed32-4619-8936-d5df421841b4","Type":"ContainerStarted","Data":"2ce14775e3244d09f3e21c785ac927fe4cce9f49abe9468c96b8539a968f7564"} Apr 16 17:40:49.352112 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:49.352080 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"db9e120f2c7347631941a3a85b043800335a2bf5b96dcb4363717d00cac11b9b"} Apr 16 17:40:49.354023 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:49.353882 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q" event={"ID":"d75665a5-d922-4bab-a0cb-028ff1c31c7a","Type":"ContainerStarted","Data":"533dd8469ac256c7892a45275c9dcac019f6a5429870372d51f8594679b98b0a"} Apr 16 17:40:50.258810 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:50.258586 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:50.258976 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:50.258932 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7" Apr 16 17:40:50.358175 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:50.358071 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q" event={"ID":"d75665a5-d922-4bab-a0cb-028ff1c31c7a","Type":"ContainerStarted","Data":"b7be7e5dbb4def74437ee8f1176cd1c15bf031416bcf60592c909e3777baba73"} Apr 16 17:40:50.378020 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:50.377962 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-78497" podStartSLOduration=6.212497419 podStartE2EDuration="23.377946618s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.053348978 +0000 UTC m=+3.427523384" lastFinishedPulling="2026-04-16 17:40:47.218798172 +0000 UTC m=+20.592972583" observedRunningTime="2026-04-16 17:40:49.364955497 +0000 UTC m=+22.739129927" watchObservedRunningTime="2026-04-16 17:40:50.377946618 +0000 UTC m=+23.752121046" Apr 16 17:40:50.378210 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:50.378077 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gg87q" podStartSLOduration=3.367944178 podStartE2EDuration="23.378074081s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.046058855 +0000 UTC m=+3.420233261" lastFinishedPulling="2026-04-16 17:40:50.056188752 +0000 UTC m=+23.430363164" observedRunningTime="2026-04-16 17:40:50.377564564 +0000 UTC m=+23.751738993" watchObservedRunningTime="2026-04-16 17:40:50.378074081 +0000 UTC m=+23.752248508" Apr 16 17:40:51.258937 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:51.258413 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:51.258937 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:51.258553 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265" Apr 16 17:40:51.366556 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:51.366518 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"d173087d2e0ace18e9b08bacf3d3ad1a0ea24ad22c200ae2c6936a6de39d90c2"} Apr 16 17:40:51.824965 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:51.824920 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qkj9z" Apr 16 17:40:51.825606 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:51.825575 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qkj9z" Apr 16 17:40:52.258993 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:52.258958 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:52.259191 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:52.259114 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7" Apr 16 17:40:52.368784 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:52.368753 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qkj9z" Apr 16 17:40:52.369291 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:52.369146 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qkj9z" Apr 16 17:40:53.258755 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:53.258589 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:53.258923 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:53.258825 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265" Apr 16 17:40:53.373006 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:53.372970 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" event={"ID":"a343c3d6-4d4e-4e8e-8658-c59308d09601","Type":"ContainerStarted","Data":"7c652f5c0899a477c1049a95c3f0b3feaea239d92ff3f006b63ebd10b18b527b"} Apr 16 17:40:53.373632 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:53.373279 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:53.374625 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:53.374602 2580 generic.go:358] "Generic (PLEG): container finished" podID="434e8414-887d-4565-9d3b-620183c5537a" containerID="87d108d2c7480f4cc9749e50b95ac2f1ea040a4dc7ab933f4513dac5bb47fef0" exitCode=0 Apr 16 17:40:53.374733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:53.374658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blkts" event={"ID":"434e8414-887d-4565-9d3b-620183c5537a","Type":"ContainerDied","Data":"87d108d2c7480f4cc9749e50b95ac2f1ea040a4dc7ab933f4513dac5bb47fef0"} Apr 16 17:40:53.390009 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:53.389987 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:53.402888 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:53.402844 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" podStartSLOduration=8.90523057 podStartE2EDuration="26.402832489s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.053349783 +0000 UTC m=+3.427524193" lastFinishedPulling="2026-04-16 17:40:47.550951703 +0000 UTC m=+20.925126112" observedRunningTime="2026-04-16 
17:40:53.402510101 +0000 UTC m=+26.776684529" watchObservedRunningTime="2026-04-16 17:40:53.402832489 +0000 UTC m=+26.777006917" Apr 16 17:40:54.258707 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:54.258674 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:54.258884 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:54.258773 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7" Apr 16 17:40:54.376569 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:54.376538 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 17:40:54.377064 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:54.377045 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:54.391624 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:54.391601 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:55.258452 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:55.258421 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:55.258714 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:55.258531 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265" Apr 16 17:40:55.379719 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:55.379681 2580 generic.go:358] "Generic (PLEG): container finished" podID="434e8414-887d-4565-9d3b-620183c5537a" containerID="7f908e3863664f8907488581fc2a22146c93727a7677d71adef9ca0e8af827f8" exitCode=0 Apr 16 17:40:55.380104 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:55.379758 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blkts" event={"ID":"434e8414-887d-4565-9d3b-620183c5537a","Type":"ContainerDied","Data":"7f908e3863664f8907488581fc2a22146c93727a7677d71adef9ca0e8af827f8"} Apr 16 17:40:55.380104 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:55.379861 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 17:40:56.258641 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:56.258608 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:56.258810 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:56.258718 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7" Apr 16 17:40:56.381662 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:56.381634 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 17:40:56.948582 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:56.948537 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:40:57.259776 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:57.259691 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:57.259918 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:57.259779 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265" Apr 16 17:40:57.385687 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:57.385649 2580 generic.go:358] "Generic (PLEG): container finished" podID="434e8414-887d-4565-9d3b-620183c5537a" containerID="1b398ef7ed1c7e2a43c885af87d06527a227b071a6c6146de50850c24acb5400" exitCode=0 Apr 16 17:40:57.386066 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:57.385722 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blkts" event={"ID":"434e8414-887d-4565-9d3b-620183c5537a","Type":"ContainerDied","Data":"1b398ef7ed1c7e2a43c885af87d06527a227b071a6c6146de50850c24acb5400"} Apr 16 17:40:57.399357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:57.399316 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" podUID="a343c3d6-4d4e-4e8e-8658-c59308d09601" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 17:40:58.258946 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:58.258915 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:40:58.259129 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:58.259065 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7" Apr 16 17:40:59.258635 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:40:59.258599 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:40:59.259136 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:40:59.258703 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265" Apr 16 17:41:00.258497 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:00.258455 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:41:00.258728 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:00.258600 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7" Apr 16 17:41:00.947013 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:00.946971 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:41:00.947341 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:00.947113 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:00.947341 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:00.947201 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs podName:75516b17-54a7-403c-b9a7-20ae8a32ebb7 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:32.947179732 +0000 UTC m=+66.321354140 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs") pod "network-metrics-daemon-2st9k" (UID: "75516b17-54a7-403c-b9a7-20ae8a32ebb7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:01.047378 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:01.047332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:41:01.047560 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:01.047538 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:41:01.047624 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:01.047566 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:41:01.047624 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:01.047576 2580 projected.go:194] Error preparing data for projected volume kube-api-access-hwk69 for pod openshift-network-diagnostics/network-check-target-hnxw4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:01.047710 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:01.047632 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69 podName:b60522d0-bdd5-4710-961d-66c6a6112265 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:33.047611416 +0000 UTC m=+66.421785823 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hwk69" (UniqueName: "kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69") pod "network-check-target-hnxw4" (UID: "b60522d0-bdd5-4710-961d-66c6a6112265") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:01.259307 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:01.259219 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:41:01.259754 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:01.259346 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265" Apr 16 17:41:02.258824 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:02.258785 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:41:02.259009 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:02.258924 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:41:03.258850 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:03.258818 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:41:03.259252 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:03.258966 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:41:03.399168 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:03.399111 2580 generic.go:358] "Generic (PLEG): container finished" podID="434e8414-887d-4565-9d3b-620183c5537a" containerID="0ee47ccd6422b18a62d04a45cf31653f58dc67cc4db243b7abea32f9ee6f42d7" exitCode=0
Apr 16 17:41:03.399328 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:03.399198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blkts" event={"ID":"434e8414-887d-4565-9d3b-620183c5537a","Type":"ContainerDied","Data":"0ee47ccd6422b18a62d04a45cf31653f58dc67cc4db243b7abea32f9ee6f42d7"}
Apr 16 17:41:04.259083 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:04.259052 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:41:04.259449 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:04.259189 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:41:04.403509 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:04.403473 2580 generic.go:358] "Generic (PLEG): container finished" podID="434e8414-887d-4565-9d3b-620183c5537a" containerID="7caec9132b8828cc0f48b660f3ec2fbaa28de7fd7586124bb3915f80f6bb9d34" exitCode=0
Apr 16 17:41:04.403696 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:04.403518 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blkts" event={"ID":"434e8414-887d-4565-9d3b-620183c5537a","Type":"ContainerDied","Data":"7caec9132b8828cc0f48b660f3ec2fbaa28de7fd7586124bb3915f80f6bb9d34"}
Apr 16 17:41:04.801741 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:04.800696 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2st9k"]
Apr 16 17:41:04.801741 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:04.800867 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:41:04.801741 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:04.801031 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:41:04.814091 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:04.814059 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hnxw4"]
Apr 16 17:41:04.814299 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:04.814281 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:41:04.814425 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:04.814404 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:41:05.409048 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:05.409016 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blkts" event={"ID":"434e8414-887d-4565-9d3b-620183c5537a","Type":"ContainerStarted","Data":"2fab00bdbf37ef95169ba32b7b9772d7c70a4b9497e9fb7d5ac4388995e5212b"}
Apr 16 17:41:05.432790 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:05.432741 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-blkts" podStartSLOduration=5.53810656 podStartE2EDuration="38.432725229s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:40:30.024412351 +0000 UTC m=+3.398586775" lastFinishedPulling="2026-04-16 17:41:02.919031031 +0000 UTC m=+36.293205444" observedRunningTime="2026-04-16 17:41:05.432529236 +0000 UTC m=+38.806703664" watchObservedRunningTime="2026-04-16 17:41:05.432725229 +0000 UTC m=+38.806899656"
Apr 16 17:41:06.259057 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:06.259027 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:41:06.259251 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:06.259134 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:41:07.259841 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:07.259806 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:41:07.260282 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:07.259893 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:41:08.258388 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:08.258195 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:41:08.258592 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:08.258494 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2st9k" podUID="75516b17-54a7-403c-b9a7-20ae8a32ebb7"
Apr 16 17:41:09.258585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.258548 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:41:09.258973 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:09.258681 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hnxw4" podUID="b60522d0-bdd5-4710-961d-66c6a6112265"
Apr 16 17:41:09.406084 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.405987 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-216.ec2.internal" event="NodeReady"
Apr 16 17:41:09.406252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.406096 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 17:41:09.454578 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.454545 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-74mzq"]
Apr 16 17:41:09.456329 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.456312 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.457630 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.457608 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xq8lz"]
Apr 16 17:41:09.458367 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.458348 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 17:41:09.458464 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.458391 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qhzh2\""
Apr 16 17:41:09.458526 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.458502 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 17:41:09.459194 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.459178 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xq8lz"
Apr 16 17:41:09.461043 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.461021 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zdjnc\""
Apr 16 17:41:09.461195 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.461069 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 17:41:09.461195 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.461102 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 17:41:09.461195 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.461119 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 17:41:09.468371 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.468352 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-74mzq"]
Apr 16 17:41:09.474267 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.472124 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xq8lz"]
Apr 16 17:41:09.612908 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.612846 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zccz\" (UniqueName: \"kubernetes.io/projected/3bb57ce2-2350-45e9-b686-a1f3b5c5c84e-kube-api-access-5zccz\") pod \"ingress-canary-xq8lz\" (UID: \"3bb57ce2-2350-45e9-b686-a1f3b5c5c84e\") " pod="openshift-ingress-canary/ingress-canary-xq8lz"
Apr 16 17:41:09.612908 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.612911 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b8b74ff2-67c6-488d-b82d-55527e9c4661-tmp-dir\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.613126 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.612982 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8b74ff2-67c6-488d-b82d-55527e9c4661-metrics-tls\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.613126 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.613017 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bb57ce2-2350-45e9-b686-a1f3b5c5c84e-cert\") pod \"ingress-canary-xq8lz\" (UID: \"3bb57ce2-2350-45e9-b686-a1f3b5c5c84e\") " pod="openshift-ingress-canary/ingress-canary-xq8lz"
Apr 16 17:41:09.613126 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.613050 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8b74ff2-67c6-488d-b82d-55527e9c4661-config-volume\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.613126 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.613066 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zbrs\" (UniqueName: \"kubernetes.io/projected/b8b74ff2-67c6-488d-b82d-55527e9c4661-kube-api-access-6zbrs\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.714299 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.714265 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zccz\" (UniqueName: \"kubernetes.io/projected/3bb57ce2-2350-45e9-b686-a1f3b5c5c84e-kube-api-access-5zccz\") pod \"ingress-canary-xq8lz\" (UID: \"3bb57ce2-2350-45e9-b686-a1f3b5c5c84e\") " pod="openshift-ingress-canary/ingress-canary-xq8lz"
Apr 16 17:41:09.714417 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.714314 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b8b74ff2-67c6-488d-b82d-55527e9c4661-tmp-dir\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.714417 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.714345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8b74ff2-67c6-488d-b82d-55527e9c4661-metrics-tls\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.714417 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.714360 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bb57ce2-2350-45e9-b686-a1f3b5c5c84e-cert\") pod \"ingress-canary-xq8lz\" (UID: \"3bb57ce2-2350-45e9-b686-a1f3b5c5c84e\") " pod="openshift-ingress-canary/ingress-canary-xq8lz"
Apr 16 17:41:09.714417 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.714390 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8b74ff2-67c6-488d-b82d-55527e9c4661-config-volume\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.714417 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.714406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zbrs\" (UniqueName: \"kubernetes.io/projected/b8b74ff2-67c6-488d-b82d-55527e9c4661-kube-api-access-6zbrs\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.714769 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.714746 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b8b74ff2-67c6-488d-b82d-55527e9c4661-tmp-dir\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.715681 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.715646 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8b74ff2-67c6-488d-b82d-55527e9c4661-config-volume\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.718461 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.718428 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8b74ff2-67c6-488d-b82d-55527e9c4661-metrics-tls\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.718570 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.718489 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bb57ce2-2350-45e9-b686-a1f3b5c5c84e-cert\") pod \"ingress-canary-xq8lz\" (UID: \"3bb57ce2-2350-45e9-b686-a1f3b5c5c84e\") " pod="openshift-ingress-canary/ingress-canary-xq8lz"
Apr 16 17:41:09.724535 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.724504 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zbrs\" (UniqueName: \"kubernetes.io/projected/b8b74ff2-67c6-488d-b82d-55527e9c4661-kube-api-access-6zbrs\") pod \"dns-default-74mzq\" (UID: \"b8b74ff2-67c6-488d-b82d-55527e9c4661\") " pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.724768 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.724743 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zccz\" (UniqueName: \"kubernetes.io/projected/3bb57ce2-2350-45e9-b686-a1f3b5c5c84e-kube-api-access-5zccz\") pod \"ingress-canary-xq8lz\" (UID: \"3bb57ce2-2350-45e9-b686-a1f3b5c5c84e\") " pod="openshift-ingress-canary/ingress-canary-xq8lz"
Apr 16 17:41:09.767169 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.767130 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:09.773926 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.773895 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xq8lz"
Apr 16 17:41:09.908422 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.908391 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-74mzq"]
Apr 16 17:41:09.910878 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:09.910854 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xq8lz"]
Apr 16 17:41:09.912077 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:09.912053 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b74ff2_67c6_488d_b82d_55527e9c4661.slice/crio-cc927f55d6ab69a4f0cb0635583d165027702aa2fd60006468ab1a8d7cdd5019 WatchSource:0}: Error finding container cc927f55d6ab69a4f0cb0635583d165027702aa2fd60006468ab1a8d7cdd5019: Status 404 returned error can't find the container with id cc927f55d6ab69a4f0cb0635583d165027702aa2fd60006468ab1a8d7cdd5019
Apr 16 17:41:09.914050 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:09.914024 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb57ce2_2350_45e9_b686_a1f3b5c5c84e.slice/crio-ef0d7367839d67cfc285f839dda7e904795663d403585ec0650daa90b00064c6 WatchSource:0}: Error finding container ef0d7367839d67cfc285f839dda7e904795663d403585ec0650daa90b00064c6: Status 404 returned error can't find the container with id ef0d7367839d67cfc285f839dda7e904795663d403585ec0650daa90b00064c6
Apr 16 17:41:10.258622 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:10.258588 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k"
Apr 16 17:41:10.261638 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:10.261619 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 17:41:10.261702 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:10.261639 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdxwt\""
Apr 16 17:41:10.419580 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:10.419545 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xq8lz" event={"ID":"3bb57ce2-2350-45e9-b686-a1f3b5c5c84e","Type":"ContainerStarted","Data":"ef0d7367839d67cfc285f839dda7e904795663d403585ec0650daa90b00064c6"}
Apr 16 17:41:10.421036 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:10.421000 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-74mzq" event={"ID":"b8b74ff2-67c6-488d-b82d-55527e9c4661","Type":"ContainerStarted","Data":"cc927f55d6ab69a4f0cb0635583d165027702aa2fd60006468ab1a8d7cdd5019"}
Apr 16 17:41:11.258451 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:11.258417 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4"
Apr 16 17:41:11.260592 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:11.260570 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 17:41:11.260961 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:11.260682 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvrx4\""
Apr 16 17:41:11.261330 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:11.261309 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 17:41:12.427579 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.427369 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xq8lz" event={"ID":"3bb57ce2-2350-45e9-b686-a1f3b5c5c84e","Type":"ContainerStarted","Data":"6de073ef04a8bb4cdaf8b4e4da90f92e52d2abff49348ec998d6324de24e2734"}
Apr 16 17:41:12.429434 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.429386 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-74mzq" event={"ID":"b8b74ff2-67c6-488d-b82d-55527e9c4661","Type":"ContainerStarted","Data":"1635fa3c88d51ca386b4b07c976f0de1f401b8d17d2f133c0183de12e0550b9f"}
Apr 16 17:41:12.434238 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.434209 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7rzmg"]
Apr 16 17:41:12.450531 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.450481 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xq8lz" podStartSLOduration=1.177228893 podStartE2EDuration="3.450464765s" podCreationTimestamp="2026-04-16 17:41:09 +0000 UTC" firstStartedPulling="2026-04-16 17:41:09.915747432 +0000 UTC m=+43.289921837" lastFinishedPulling="2026-04-16 17:41:12.188983291 +0000 UTC m=+45.563157709" observedRunningTime="2026-04-16 17:41:12.449650523 +0000 UTC m=+45.823824950" watchObservedRunningTime="2026-04-16 17:41:12.450464765 +0000 UTC m=+45.824639190"
Apr 16 17:41:12.451021 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.450997 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7rzmg"]
Apr 16 17:41:12.451179 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.451140 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.453283 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.453262 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 17:41:12.453439 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.453414 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 17:41:12.453558 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.453541 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-wxp4x\""
Apr 16 17:41:12.453558 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.453544 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 17:41:12.453558 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.453542 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 17:41:12.453712 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.453616 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 17:41:12.636367 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.636325 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stztm\" (UniqueName: \"kubernetes.io/projected/8038cd78-0be0-4711-b1bc-f799dd11b41d-kube-api-access-stztm\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.636367 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.636364 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8038cd78-0be0-4711-b1bc-f799dd11b41d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.636569 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.636391 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8038cd78-0be0-4711-b1bc-f799dd11b41d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.636569 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.636489 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8038cd78-0be0-4711-b1bc-f799dd11b41d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.738143 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.737868 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8038cd78-0be0-4711-b1bc-f799dd11b41d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.738143 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.738010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8038cd78-0be0-4711-b1bc-f799dd11b41d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.738143 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.738104 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8038cd78-0be0-4711-b1bc-f799dd11b41d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.739016 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.738411 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stztm\" (UniqueName: \"kubernetes.io/projected/8038cd78-0be0-4711-b1bc-f799dd11b41d-kube-api-access-stztm\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.740406 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.740342 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8038cd78-0be0-4711-b1bc-f799dd11b41d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.747929 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.747868 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8038cd78-0be0-4711-b1bc-f799dd11b41d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.748288 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.748009 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8038cd78-0be0-4711-b1bc-f799dd11b41d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.762009 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.761947 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stztm\" (UniqueName: \"kubernetes.io/projected/8038cd78-0be0-4711-b1bc-f799dd11b41d-kube-api-access-stztm\") pod \"prometheus-operator-78f957474d-7rzmg\" (UID: \"8038cd78-0be0-4711-b1bc-f799dd11b41d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.773626 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.773549 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg"
Apr 16 17:41:12.955501 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:12.955309 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7rzmg"]
Apr 16 17:41:12.959897 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:12.959872 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8038cd78_0be0_4711_b1bc_f799dd11b41d.slice/crio-b610f1b3fa55a4a9f0a8abbd0aee64988fd1751fbeaca021a9b3d718aae49c87 WatchSource:0}: Error finding container b610f1b3fa55a4a9f0a8abbd0aee64988fd1751fbeaca021a9b3d718aae49c87: Status 404 returned error can't find the container with id b610f1b3fa55a4a9f0a8abbd0aee64988fd1751fbeaca021a9b3d718aae49c87
Apr 16 17:41:13.432499 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.432467 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-74mzq" event={"ID":"b8b74ff2-67c6-488d-b82d-55527e9c4661","Type":"ContainerStarted","Data":"b6e91a26c077b5bae95477a4f359c0084ad0b42f96ed2ffee6940baab3fa77f1"}
Apr 16 17:41:13.433191 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.432592 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-74mzq"
Apr 16 17:41:13.433512 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.433484 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg" event={"ID":"8038cd78-0be0-4711-b1bc-f799dd11b41d","Type":"ContainerStarted","Data":"b610f1b3fa55a4a9f0a8abbd0aee64988fd1751fbeaca021a9b3d718aae49c87"}
Apr 16 17:41:13.451582 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.451526 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-74mzq" podStartSLOduration=2.181503216 podStartE2EDuration="4.451511565s" podCreationTimestamp="2026-04-16 17:41:09 +0000 UTC" firstStartedPulling="2026-04-16 17:41:09.914073979 +0000 UTC m=+43.288248385" lastFinishedPulling="2026-04-16 17:41:12.184082314 +0000 UTC m=+45.558256734" observedRunningTime="2026-04-16 17:41:13.450275009 +0000 UTC m=+46.824449438" watchObservedRunningTime="2026-04-16 17:41:13.451511565 +0000 UTC m=+46.825685993"
Apr 16 17:41:13.529296 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.529261 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-548656dbdd-jjqn5"]
Apr 16 17:41:13.551736 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.551704 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-548656dbdd-jjqn5"]
Apr 16 17:41:13.551911 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.551849 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.553934 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.553910 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 17:41:13.554496 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.554469 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 17:41:13.554625 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.554563 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 17:41:13.554625 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.554598 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 17:41:13.554737 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.554626 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 17:41:13.554737 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.554625 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lwftg\""
Apr 16 17:41:13.554737 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.554564 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 17:41:13.554737 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.554734 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 17:41:13.559967 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.559945 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 17:41:13.646720 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.646684 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-oauth-config\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.646895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.646736 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-serving-cert\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.646895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.646791 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-trusted-ca-bundle\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.646895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.646826 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-config\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.646895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.646855 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-service-ca\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.646895 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.646877 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdvjx\" (UniqueName: \"kubernetes.io/projected/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-kube-api-access-tdvjx\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.647066 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.646907 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-oauth-serving-cert\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.748250 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.748139 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-oauth-serving-cert\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.748250 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.748237 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-oauth-config\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.748474 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.748284 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-serving-cert\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.748474 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.748308 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-trusted-ca-bundle\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5"
Apr 16 17:41:13.748474 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.748342 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-config\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") "
pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.748474 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.748366 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-service-ca\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.748474 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.748405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdvjx\" (UniqueName: \"kubernetes.io/projected/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-kube-api-access-tdvjx\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.748917 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.748888 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-oauth-serving-cert\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.752199 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.752151 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-oauth-config\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.752199 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.752181 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-serving-cert\") pod \"console-548656dbdd-jjqn5\" 
(UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.759739 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.759714 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-service-ca\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.759859 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.759790 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-config\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.759967 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.759943 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-trusted-ca-bundle\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.771334 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.771304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdvjx\" (UniqueName: \"kubernetes.io/projected/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-kube-api-access-tdvjx\") pod \"console-548656dbdd-jjqn5\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:13.876274 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:13.876230 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:14.024953 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:14.024876 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-548656dbdd-jjqn5"] Apr 16 17:41:14.328371 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:14.328289 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d5cf3a_53b4_4373_b554_3935ed9f47d6.slice/crio-030227eecaa8fbce9b65912691579f0b8d51909d32377bf96f79ae681300e995 WatchSource:0}: Error finding container 030227eecaa8fbce9b65912691579f0b8d51909d32377bf96f79ae681300e995: Status 404 returned error can't find the container with id 030227eecaa8fbce9b65912691579f0b8d51909d32377bf96f79ae681300e995 Apr 16 17:41:14.436050 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:14.436011 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-548656dbdd-jjqn5" event={"ID":"d1d5cf3a-53b4-4373-b554-3935ed9f47d6","Type":"ContainerStarted","Data":"030227eecaa8fbce9b65912691579f0b8d51909d32377bf96f79ae681300e995"} Apr 16 17:41:14.593937 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:14.593848 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qnz92_d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f/dns-node-resolver/0.log" Apr 16 17:41:15.439752 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:15.439714 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg" event={"ID":"8038cd78-0be0-4711-b1bc-f799dd11b41d","Type":"ContainerStarted","Data":"6f70860b56b866a7de64681ba218b6048093638f0eb6f43098ae49322216db04"} Apr 16 17:41:15.439752 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:15.439756 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg" 
event={"ID":"8038cd78-0be0-4711-b1bc-f799dd11b41d","Type":"ContainerStarted","Data":"01f9417ff0a8eb0524a0ad954fbc69a95b108718df8050b3f7064cf6292e6ce5"} Apr 16 17:41:15.459308 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:15.459249 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-7rzmg" podStartSLOduration=1.717337544 podStartE2EDuration="3.459235075s" podCreationTimestamp="2026-04-16 17:41:12 +0000 UTC" firstStartedPulling="2026-04-16 17:41:12.961616582 +0000 UTC m=+46.335790988" lastFinishedPulling="2026-04-16 17:41:14.70351411 +0000 UTC m=+48.077688519" observedRunningTime="2026-04-16 17:41:15.458420609 +0000 UTC m=+48.832595037" watchObservedRunningTime="2026-04-16 17:41:15.459235075 +0000 UTC m=+48.833409502" Apr 16 17:41:15.794283 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:15.794200 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mxk78_4a975d1a-4be7-41a5-b7fd-95561bba816e/node-ca/0.log" Apr 16 17:41:16.594314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:16.594280 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xq8lz_3bb57ce2-2350-45e9-b686-a1f3b5c5c84e/serve-healthcheck-canary/0.log" Apr 16 17:41:17.792384 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.792146 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-688vs"] Apr 16 17:41:17.800202 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.800181 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.802214 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.802194 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 17:41:17.802332 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.802284 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-4pfdx\"" Apr 16 17:41:17.802386 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.802373 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 17:41:17.805601 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.805577 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-688vs"] Apr 16 17:41:17.808888 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.808866 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-79cxz"] Apr 16 17:41:17.816445 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.816425 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.818877 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.818851 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 17:41:17.818877 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.818858 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 17:41:17.819042 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.819012 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 17:41:17.819042 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.819022 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-dz9mw\"" Apr 16 17:41:17.826847 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.826824 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-79cxz"] Apr 16 17:41:17.833327 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.833306 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ndrq9"] Apr 16 17:41:17.839743 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.839724 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.841923 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.841901 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 17:41:17.841923 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.841913 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 17:41:17.842082 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.841920 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wqdvb\"" Apr 16 17:41:17.842082 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.842006 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 17:41:17.882356 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.882356 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882358 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef001566-3c97-441f-9199-22bb6149bb4a-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.882560 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882390 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-tls\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882560 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882424 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882560 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882444 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wcdw\" (UniqueName: \"kubernetes.io/projected/30adbe00-e7ec-49f8-a027-027cac12b3e9-kube-api-access-5wcdw\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882560 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882462 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-wtmp\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882560 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882479 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-accelerators-collector-config\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882560 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.882733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882567 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.882733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ef001566-3c97-441f-9199-22bb6149bb4a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.882733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882605 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttpds\" 
(UniqueName: \"kubernetes.io/projected/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-api-access-ttpds\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.882733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882621 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-sys\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882645 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-root\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882660 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30adbe00-e7ec-49f8-a027-027cac12b3e9-metrics-client-ca\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882733 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882717 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-textfile\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.882932 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882748 
2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a9ce56c7-1a84-4a39-b540-ec90e251a81a-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.882932 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882764 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9ce56c7-1a84-4a39-b540-ec90e251a81a-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.882932 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882781 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef001566-3c97-441f-9199-22bb6149bb4a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.882932 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.882797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjnm\" (UniqueName: \"kubernetes.io/projected/ef001566-3c97-441f-9199-22bb6149bb4a-kube-api-access-cbjnm\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.983260 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983222 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-textfile\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.983260 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983262 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a9ce56c7-1a84-4a39-b540-ec90e251a81a-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.983495 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983280 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9ce56c7-1a84-4a39-b540-ec90e251a81a-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.983495 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983298 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef001566-3c97-441f-9199-22bb6149bb4a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.983495 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983359 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjnm\" (UniqueName: \"kubernetes.io/projected/ef001566-3c97-441f-9199-22bb6149bb4a-kube-api-access-cbjnm\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.983495 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983412 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.983495 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:17.983495 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 17:41:17.983723 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:17.983552 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-tls podName:a9ce56c7-1a84-4a39-b540-ec90e251a81a nodeName:}" failed. No retries permitted until 2026-04-16 17:41:18.483534235 +0000 UTC m=+51.857708652 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-79cxz" (UID: "a9ce56c7-1a84-4a39-b540-ec90e251a81a") : secret "kube-state-metrics-tls" not found Apr 16 17:41:17.983723 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef001566-3c97-441f-9199-22bb6149bb4a-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.983723 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983602 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-textfile\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.983723 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983645 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-tls\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.983723 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a9ce56c7-1a84-4a39-b540-ec90e251a81a-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.983723 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:17.983725 2580 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wcdw\" (UniqueName: \"kubernetes.io/projected/30adbe00-e7ec-49f8-a027-027cac12b3e9-kube-api-access-5wcdw\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:17.983779 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-tls podName:30adbe00-e7ec-49f8-a027-027cac12b3e9 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:18.483760523 +0000 UTC m=+51.857934931 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-tls") pod "node-exporter-ndrq9" (UID: "30adbe00-e7ec-49f8-a027-027cac12b3e9") : secret "node-exporter-tls" not found Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983800 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-wtmp\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983830 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-accelerators-collector-config\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ef001566-3c97-441f-9199-22bb6149bb4a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983976 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-wtmp\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.983985 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttpds\" (UniqueName: \"kubernetes.io/projected/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-api-access-ttpds\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.984031 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9ce56c7-1a84-4a39-b540-ec90e251a81a-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.984056 ip-10-0-143-216 kubenswrapper[2580]: 
I0416 17:41:17.984041 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-sys\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984629 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.984070 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-root\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984629 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.984112 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-root\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984629 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.984144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30adbe00-e7ec-49f8-a027-027cac12b3e9-sys\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984629 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.984327 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef001566-3c97-441f-9199-22bb6149bb4a-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.984629 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.984439 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-accelerators-collector-config\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.984629 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.984453 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30adbe00-e7ec-49f8-a027-027cac12b3e9-metrics-client-ca\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.985110 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.984677 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.985196 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.985147 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30adbe00-e7ec-49f8-a027-027cac12b3e9-metrics-client-ca\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.987205 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.987175 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ndrq9\" (UID: 
\"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.987305 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.987220 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ef001566-3c97-441f-9199-22bb6149bb4a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.987305 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.987285 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef001566-3c97-441f-9199-22bb6149bb4a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.987305 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.987284 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:17.992641 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.992617 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjnm\" (UniqueName: \"kubernetes.io/projected/ef001566-3c97-441f-9199-22bb6149bb4a-kube-api-access-cbjnm\") pod \"openshift-state-metrics-5669946b84-688vs\" (UID: \"ef001566-3c97-441f-9199-22bb6149bb4a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:17.994016 
ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.993992 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wcdw\" (UniqueName: \"kubernetes.io/projected/30adbe00-e7ec-49f8-a027-027cac12b3e9-kube-api-access-5wcdw\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:17.994871 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:17.994848 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttpds\" (UniqueName: \"kubernetes.io/projected/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-api-access-ttpds\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:18.109822 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.109735 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" Apr 16 17:41:18.235111 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.235074 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-688vs"] Apr 16 17:41:18.239465 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:18.239435 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef001566_3c97_441f_9199_22bb6149bb4a.slice/crio-8d58e4fda9eb8a4b94cf1663900eb67e7713e430b793143b2028920563680dd2 WatchSource:0}: Error finding container 8d58e4fda9eb8a4b94cf1663900eb67e7713e430b793143b2028920563680dd2: Status 404 returned error can't find the container with id 8d58e4fda9eb8a4b94cf1663900eb67e7713e430b793143b2028920563680dd2 Apr 16 17:41:18.449925 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.449735 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-548656dbdd-jjqn5" 
event={"ID":"d1d5cf3a-53b4-4373-b554-3935ed9f47d6","Type":"ContainerStarted","Data":"dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48"} Apr 16 17:41:18.450921 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.450897 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" event={"ID":"ef001566-3c97-441f-9199-22bb6149bb4a","Type":"ContainerStarted","Data":"4159cbd9622e64a9cc0e07a10bc2ea9d2583e625538d4cca2f8533a79f2caa36"} Apr 16 17:41:18.451030 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.450925 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" event={"ID":"ef001566-3c97-441f-9199-22bb6149bb4a","Type":"ContainerStarted","Data":"8d58e4fda9eb8a4b94cf1663900eb67e7713e430b793143b2028920563680dd2"} Apr 16 17:41:18.468437 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.468383 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-548656dbdd-jjqn5" podStartSLOduration=2.266142055 podStartE2EDuration="5.468368409s" podCreationTimestamp="2026-04-16 17:41:13 +0000 UTC" firstStartedPulling="2026-04-16 17:41:14.330437844 +0000 UTC m=+47.704612250" lastFinishedPulling="2026-04-16 17:41:17.532664198 +0000 UTC m=+50.906838604" observedRunningTime="2026-04-16 17:41:18.467751303 +0000 UTC m=+51.841925730" watchObservedRunningTime="2026-04-16 17:41:18.468368409 +0000 UTC m=+51.842542836" Apr 16 17:41:18.489566 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.489533 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:18.489742 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:41:18.489585 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-tls\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:18.492777 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.492752 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/30adbe00-e7ec-49f8-a027-027cac12b3e9-node-exporter-tls\") pod \"node-exporter-ndrq9\" (UID: \"30adbe00-e7ec-49f8-a027-027cac12b3e9\") " pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:18.492875 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.492836 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9ce56c7-1a84-4a39-b540-ec90e251a81a-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-79cxz\" (UID: \"a9ce56c7-1a84-4a39-b540-ec90e251a81a\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:18.724564 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.724533 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" Apr 16 17:41:18.748054 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.748022 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ndrq9" Apr 16 17:41:18.859859 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.859825 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-79cxz"] Apr 16 17:41:18.863048 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:18.863011 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9ce56c7_1a84_4a39_b540_ec90e251a81a.slice/crio-939bc7075ad662a5883a412f6a0bc6955331c65232af191817056cc7e8568f97 WatchSource:0}: Error finding container 939bc7075ad662a5883a412f6a0bc6955331c65232af191817056cc7e8568f97: Status 404 returned error can't find the container with id 939bc7075ad662a5883a412f6a0bc6955331c65232af191817056cc7e8568f97 Apr 16 17:41:18.926815 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.926781 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:41:18.951596 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.951564 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.958651 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.958619 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 17:41:18.958651 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.958637 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 17:41:18.959555 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.959369 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 17:41:18.959555 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.959418 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 17:41:18.959555 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.959450 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 17:41:18.959735 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.959558 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 17:41:18.959735 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.959615 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 17:41:18.961066 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.961041 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:41:18.964740 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.964725 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 17:41:18.965365 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.965352 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-hvs78\"" Apr 16 17:41:18.968315 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.968300 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 17:41:18.994352 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994271 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994352 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994330 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08803585-a323-4ee6-80e0-b8a63b822ca2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-web-config\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994379 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08803585-a323-4ee6-80e0-b8a63b822ca2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994430 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/08803585-a323-4ee6-80e0-b8a63b822ca2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/08803585-a323-4ee6-80e0-b8a63b822ca2-config-out\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994486 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994510 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994760 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994570 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/08803585-a323-4ee6-80e0-b8a63b822ca2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994760 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994598 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-config-volume\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994760 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994624 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:18.994760 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:18.994648 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7kjm\" 
(UniqueName: \"kubernetes.io/projected/08803585-a323-4ee6-80e0-b8a63b822ca2-kube-api-access-f7kjm\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095752 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095720 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-web-config\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095752 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08803585-a323-4ee6-80e0-b8a63b822ca2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095776 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095801 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/08803585-a323-4ee6-80e0-b8a63b822ca2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095826 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/08803585-a323-4ee6-80e0-b8a63b822ca2-config-out\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095865 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095893 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:19.095906 2580 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095927 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/08803585-a323-4ee6-80e0-b8a63b822ca2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.095956 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-config-volume\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.095989 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:19.095972 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-main-tls podName:08803585-a323-4ee6-80e0-b8a63b822ca2 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:19.595951727 +0000 UTC m=+52.970126141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "08803585-a323-4ee6-80e0-b8a63b822ca2") : secret "alertmanager-main-tls" not found Apr 16 17:41:19.096402 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.096000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.096402 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.096036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7kjm\" (UniqueName: \"kubernetes.io/projected/08803585-a323-4ee6-80e0-b8a63b822ca2-kube-api-access-f7kjm\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.096402 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.096312 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.096402 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.096380 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08803585-a323-4ee6-80e0-b8a63b822ca2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.097585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.096701 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/08803585-a323-4ee6-80e0-b8a63b822ca2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.097585 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.097518 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08803585-a323-4ee6-80e0-b8a63b822ca2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.097986 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.097958 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08803585-a323-4ee6-80e0-b8a63b822ca2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.099813 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.099781 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.099894 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.099785 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/08803585-a323-4ee6-80e0-b8a63b822ca2-config-out\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.099894 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.099791 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-config-volume\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.099894 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.099877 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/08803585-a323-4ee6-80e0-b8a63b822ca2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.099894 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.099887 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.100135 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.100115 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.100204 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.100182 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.100677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.100661 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-web-config\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.105231 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.105209 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7kjm\" (UniqueName: \"kubernetes.io/projected/08803585-a323-4ee6-80e0-b8a63b822ca2-kube-api-access-f7kjm\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.456172 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.456114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ndrq9" event={"ID":"30adbe00-e7ec-49f8-a027-027cac12b3e9","Type":"ContainerStarted","Data":"7d7b313cf8e40701651f24b74b06d2228fa9f0bd9f45bc291442fe12e77f5063"} Apr 16 17:41:19.457980 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.457932 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" event={"ID":"ef001566-3c97-441f-9199-22bb6149bb4a","Type":"ContainerStarted","Data":"40590a20eb2f49e198e92b5eec1b42126605ef9a7c4c6bb3d42fa3b677ee551b"} Apr 16 17:41:19.459049 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.459022 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" event={"ID":"a9ce56c7-1a84-4a39-b540-ec90e251a81a","Type":"ContainerStarted","Data":"939bc7075ad662a5883a412f6a0bc6955331c65232af191817056cc7e8568f97"} Apr 16 17:41:19.600726 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.600690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.603656 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.603629 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/08803585-a323-4ee6-80e0-b8a63b822ca2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"08803585-a323-4ee6-80e0-b8a63b822ca2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.862872 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.862799 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 17:41:19.879762 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.879727 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw"] Apr 16 17:41:19.899626 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.899598 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw"] Apr 16 17:41:19.899787 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.899772 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:19.902252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.902226 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 17:41:19.902398 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.902263 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 17:41:19.902398 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.902269 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 17:41:19.902398 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.902317 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-bpxnt\"" Apr 16 17:41:19.902398 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.902355 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-coia0a726g40j\"" Apr 16 17:41:19.902642 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.902627 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 17:41:19.902712 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:19.902700 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 17:41:20.003985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.003947 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-grpc-tls\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.004171 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.004008 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.004171 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.004060 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.004171 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.004131 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.004304 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.004186 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.004304 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.004219 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rdg\" (UniqueName: \"kubernetes.io/projected/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-kube-api-access-b7rdg\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.004304 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.004257 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-tls\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.004304 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.004283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-metrics-client-ca\") pod 
\"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.105213 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.105180 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.105365 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.105229 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.105365 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.105307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.105365 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.105347 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " 
pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.105523 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.105376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7rdg\" (UniqueName: \"kubernetes.io/projected/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-kube-api-access-b7rdg\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.105523 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.105435 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-tls\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.105523 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.105460 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-metrics-client-ca\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.105711 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.105528 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-grpc-tls\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.115071 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.114797 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-grpc-tls\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.116251 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.115997 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.119252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.118697 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-metrics-client-ca\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.119362 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.119263 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.119735 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.119692 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-tls\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " 
pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.120058 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.120040 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.122075 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.121761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.125948 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.125926 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7rdg\" (UniqueName: \"kubernetes.io/projected/9c8951a4-4f69-4f1e-9150-e3ddb02d29b5-kube-api-access-b7rdg\") pod \"thanos-querier-fd5fbbd54-tdvhw\" (UID: \"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5\") " pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.210043 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.210005 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:20.229196 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.229130 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 17:41:20.231831 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:20.231798 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08803585_a323_4ee6_80e0_b8a63b822ca2.slice/crio-b07ac7cf59a03a113c9de5d46f05efb9c8ea761e1a8253a80cec21e94f7d6551 WatchSource:0}: Error finding container b07ac7cf59a03a113c9de5d46f05efb9c8ea761e1a8253a80cec21e94f7d6551: Status 404 returned error can't find the container with id b07ac7cf59a03a113c9de5d46f05efb9c8ea761e1a8253a80cec21e94f7d6551 Apr 16 17:41:20.359201 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.359115 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw"] Apr 16 17:41:20.463110 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.463060 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" event={"ID":"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5","Type":"ContainerStarted","Data":"ed0e11cab0d5c0301ac8a77672301f84ea9d977b51ec23d39d2fcab4a9d1cd6f"} Apr 16 17:41:20.464298 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.464270 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"08803585-a323-4ee6-80e0-b8a63b822ca2","Type":"ContainerStarted","Data":"b07ac7cf59a03a113c9de5d46f05efb9c8ea761e1a8253a80cec21e94f7d6551"} Apr 16 17:41:20.465764 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.465738 2580 generic.go:358] "Generic (PLEG): container finished" podID="30adbe00-e7ec-49f8-a027-027cac12b3e9" containerID="8b87248123b4bed591de37733086e3c74a2ac40bf3b228d4c85a836ea0851811" exitCode=0 Apr 16 
17:41:20.465875 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.465767 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ndrq9" event={"ID":"30adbe00-e7ec-49f8-a027-027cac12b3e9","Type":"ContainerDied","Data":"8b87248123b4bed591de37733086e3c74a2ac40bf3b228d4c85a836ea0851811"} Apr 16 17:41:20.468132 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.468112 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" event={"ID":"ef001566-3c97-441f-9199-22bb6149bb4a","Type":"ContainerStarted","Data":"17244ad0223b8cc79b47471a3bd1796c0d0bebb98c740e60790e22e4dadfeabf"} Apr 16 17:41:20.504654 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:20.504603 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-688vs" podStartSLOduration=1.9569443899999999 podStartE2EDuration="3.504589075s" podCreationTimestamp="2026-04-16 17:41:17 +0000 UTC" firstStartedPulling="2026-04-16 17:41:18.526455254 +0000 UTC m=+51.900629661" lastFinishedPulling="2026-04-16 17:41:20.074099941 +0000 UTC m=+53.448274346" observedRunningTime="2026-04-16 17:41:20.503894475 +0000 UTC m=+53.878068903" watchObservedRunningTime="2026-04-16 17:41:20.504589075 +0000 UTC m=+53.878763514" Apr 16 17:41:21.476722 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:21.476510 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ndrq9" event={"ID":"30adbe00-e7ec-49f8-a027-027cac12b3e9","Type":"ContainerStarted","Data":"b30562918821393ac7c765401feab38a4da51817f853e3340fa3433a5eb1fbe6"} Apr 16 17:41:21.476722 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:21.476689 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ndrq9" 
event={"ID":"30adbe00-e7ec-49f8-a027-027cac12b3e9","Type":"ContainerStarted","Data":"5423956bc3ad0b141219478ca8f82cebc6c76a985736e8e7411d255755b13f12"} Apr 16 17:41:21.484618 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:21.484589 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" event={"ID":"a9ce56c7-1a84-4a39-b540-ec90e251a81a","Type":"ContainerStarted","Data":"c583e480884d9b71576cfe27762c3df6157e9e7c341e171f84ecb006f3989578"} Apr 16 17:41:21.484767 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:21.484623 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" event={"ID":"a9ce56c7-1a84-4a39-b540-ec90e251a81a","Type":"ContainerStarted","Data":"66d3af549ed886551010b409fc3fed7e97bb474ec3e37a14e3f148aaa58ad16b"} Apr 16 17:41:21.484767 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:21.484637 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" event={"ID":"a9ce56c7-1a84-4a39-b540-ec90e251a81a","Type":"ContainerStarted","Data":"fa0a7efa808ee169ba882ad46ca9b27506a6a6d109a705f07dd9db9682f27d02"} Apr 16 17:41:21.499114 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:21.498677 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ndrq9" podStartSLOduration=3.188108313 podStartE2EDuration="4.498658616s" podCreationTimestamp="2026-04-16 17:41:17 +0000 UTC" firstStartedPulling="2026-04-16 17:41:18.760937247 +0000 UTC m=+52.135111655" lastFinishedPulling="2026-04-16 17:41:20.07148755 +0000 UTC m=+53.445661958" observedRunningTime="2026-04-16 17:41:21.497243408 +0000 UTC m=+54.871417837" watchObservedRunningTime="2026-04-16 17:41:21.498658616 +0000 UTC m=+54.872833045" Apr 16 17:41:21.517504 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:21.516791 2580 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-79cxz" podStartSLOduration=2.281258448 podStartE2EDuration="4.516770621s" podCreationTimestamp="2026-04-16 17:41:17 +0000 UTC" firstStartedPulling="2026-04-16 17:41:18.865567874 +0000 UTC m=+52.239742289" lastFinishedPulling="2026-04-16 17:41:21.101080049 +0000 UTC m=+54.475254462" observedRunningTime="2026-04-16 17:41:21.516328094 +0000 UTC m=+54.890502523" watchObservedRunningTime="2026-04-16 17:41:21.516770621 +0000 UTC m=+54.890945051" Apr 16 17:41:22.320321 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.320287 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-77bdf9f56d-5n2h8"] Apr 16 17:41:22.344048 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.344004 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-77bdf9f56d-5n2h8"] Apr 16 17:41:22.344287 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.344115 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.347460 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.347137 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-xtbkp\"" Apr 16 17:41:22.347460 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.347194 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 17:41:22.347460 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.347203 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 17:41:22.347460 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.347224 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-a2ogrvko4n37h\"" Apr 16 17:41:22.347460 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.347236 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 17:41:22.347460 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.347304 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 17:41:22.429788 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.429755 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ps6k\" (UniqueName: \"kubernetes.io/projected/55588953-abf7-4d65-9e95-decf43201d1d-kube-api-access-7ps6k\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.429788 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.429795 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/55588953-abf7-4d65-9e95-decf43201d1d-metrics-server-audit-profiles\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.430042 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.429901 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55588953-abf7-4d65-9e95-decf43201d1d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.430042 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.429953 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-secret-metrics-server-client-certs\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.430042 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.430023 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/55588953-abf7-4d65-9e95-decf43201d1d-audit-log\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.430221 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.430096 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-secret-metrics-server-tls\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.430221 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.430123 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-client-ca-bundle\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.530789 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.530754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-secret-metrics-server-tls\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.531312 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.530860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-client-ca-bundle\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.531312 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.531000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ps6k\" (UniqueName: \"kubernetes.io/projected/55588953-abf7-4d65-9e95-decf43201d1d-kube-api-access-7ps6k\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 
17:41:22.531312 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.531037 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/55588953-abf7-4d65-9e95-decf43201d1d-metrics-server-audit-profiles\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.531312 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.531147 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55588953-abf7-4d65-9e95-decf43201d1d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.531312 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.531231 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-secret-metrics-server-client-certs\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.531906 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.531858 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55588953-abf7-4d65-9e95-decf43201d1d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.532645 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.532141 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/55588953-abf7-4d65-9e95-decf43201d1d-metrics-server-audit-profiles\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.532645 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.532151 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/55588953-abf7-4d65-9e95-decf43201d1d-audit-log\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.532645 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.532550 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/55588953-abf7-4d65-9e95-decf43201d1d-audit-log\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.535601 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.535555 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-secret-metrics-server-client-certs\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.535601 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.535586 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-client-ca-bundle\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 
17:41:22.535741 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.535637 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/55588953-abf7-4d65-9e95-decf43201d1d-secret-metrics-server-tls\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.544570 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.544536 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ps6k\" (UniqueName: \"kubernetes.io/projected/55588953-abf7-4d65-9e95-decf43201d1d-kube-api-access-7ps6k\") pod \"metrics-server-77bdf9f56d-5n2h8\" (UID: \"55588953-abf7-4d65-9e95-decf43201d1d\") " pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.582770 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.582687 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn"] Apr 16 17:41:22.618307 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.618274 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn"] Apr 16 17:41:22.618463 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.618440 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" Apr 16 17:41:22.620684 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.620652 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-wqsts\"" Apr 16 17:41:22.620994 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.620978 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 17:41:22.655812 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.655337 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:22.733389 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.733354 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7436ea53-0555-458d-bb0d-86a73624beff-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-2q6xn\" (UID: \"7436ea53-0555-458d-bb0d-86a73624beff\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" Apr 16 17:41:22.834783 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.834690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7436ea53-0555-458d-bb0d-86a73624beff-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-2q6xn\" (UID: \"7436ea53-0555-458d-bb0d-86a73624beff\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" Apr 16 17:41:22.837952 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.837920 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7436ea53-0555-458d-bb0d-86a73624beff-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-2q6xn\" (UID: 
\"7436ea53-0555-458d-bb0d-86a73624beff\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" Apr 16 17:41:22.930744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:22.930715 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" Apr 16 17:41:23.029751 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.029728 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-77bdf9f56d-5n2h8"] Apr 16 17:41:23.078855 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.078827 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn"] Apr 16 17:41:23.112344 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:23.112273 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55588953_abf7_4d65_9e95_decf43201d1d.slice/crio-8d9ad0808735bde1280a96a2fbd1aa9e39a94f6cd4221a4832254e8e0afac4ab WatchSource:0}: Error finding container 8d9ad0808735bde1280a96a2fbd1aa9e39a94f6cd4221a4832254e8e0afac4ab: Status 404 returned error can't find the container with id 8d9ad0808735bde1280a96a2fbd1aa9e39a94f6cd4221a4832254e8e0afac4ab Apr 16 17:41:23.112990 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:23.112915 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7436ea53_0555_458d_bb0d_86a73624beff.slice/crio-98a6ac354b545e382f2a1f53c1ed51266ac6f5d0ed3e91d26e4e9fd7cf36ab39 WatchSource:0}: Error finding container 98a6ac354b545e382f2a1f53c1ed51266ac6f5d0ed3e91d26e4e9fd7cf36ab39: Status 404 returned error can't find the container with id 98a6ac354b545e382f2a1f53c1ed51266ac6f5d0ed3e91d26e4e9fd7cf36ab39 Apr 16 17:41:23.439204 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.439171 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-74mzq" Apr 16 17:41:23.492062 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.492032 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" event={"ID":"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5","Type":"ContainerStarted","Data":"5b7cb40a0bf6c92030c9931092f89db2f573e960acdb990d8ccf9ae9c1b57b8a"} Apr 16 17:41:23.492062 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.492066 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" event={"ID":"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5","Type":"ContainerStarted","Data":"65560c4bb18bf6e6478aa1c1c7468381231ded2dd3fa4d3485ecedfb5a4e0525"} Apr 16 17:41:23.492311 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.492076 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" event={"ID":"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5","Type":"ContainerStarted","Data":"e372da9fd4a384a648f888eb8c0fe2f54a30cff3783ae97a80e80606c066bab9"} Apr 16 17:41:23.493425 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.493397 2580 generic.go:358] "Generic (PLEG): container finished" podID="08803585-a323-4ee6-80e0-b8a63b822ca2" containerID="a29bbe563105a41ebae256e9b340fceb403aa2d93b21f754983dbab818f01d06" exitCode=0 Apr 16 17:41:23.493550 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.493473 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"08803585-a323-4ee6-80e0-b8a63b822ca2","Type":"ContainerDied","Data":"a29bbe563105a41ebae256e9b340fceb403aa2d93b21f754983dbab818f01d06"} Apr 16 17:41:23.494566 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.494548 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" 
event={"ID":"7436ea53-0555-458d-bb0d-86a73624beff","Type":"ContainerStarted","Data":"98a6ac354b545e382f2a1f53c1ed51266ac6f5d0ed3e91d26e4e9fd7cf36ab39"} Apr 16 17:41:23.495696 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.495665 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" event={"ID":"55588953-abf7-4d65-9e95-decf43201d1d","Type":"ContainerStarted","Data":"8d9ad0808735bde1280a96a2fbd1aa9e39a94f6cd4221a4832254e8e0afac4ab"} Apr 16 17:41:23.755430 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.754757 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-548656dbdd-jjqn5"] Apr 16 17:41:23.783149 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.783116 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-756b5d68f9-npcpq"] Apr 16 17:41:23.808573 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.808375 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756b5d68f9-npcpq"] Apr 16 17:41:23.808573 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.808498 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.853294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.843326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-serving-cert\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.853294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.843375 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-console-config\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.853294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.843413 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-service-ca\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.853294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.843469 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-trusted-ca-bundle\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.853294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.843521 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw6tp\" 
(UniqueName: \"kubernetes.io/projected/6139c482-c4ae-4b18-b1d3-8809aca43004-kube-api-access-tw6tp\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.853294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.843588 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-oauth-serving-cert\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.853294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.843640 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-oauth-config\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.877272 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.877229 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:23.947314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.946711 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-oauth-config\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.947314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.946775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-serving-cert\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.947314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.946808 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-console-config\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.947314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.946842 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-service-ca\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.947314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.946895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-trusted-ca-bundle\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.947314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.946940 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw6tp\" (UniqueName: \"kubernetes.io/projected/6139c482-c4ae-4b18-b1d3-8809aca43004-kube-api-access-tw6tp\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.947314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.947006 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-oauth-serving-cert\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.947928 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.947903 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-oauth-serving-cert\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.948370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.948347 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-trusted-ca-bundle\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.948458 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.948392 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-service-ca\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.973313 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.963673 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-console-config\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.973313 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.967396 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw6tp\" (UniqueName: \"kubernetes.io/projected/6139c482-c4ae-4b18-b1d3-8809aca43004-kube-api-access-tw6tp\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.973313 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.967913 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-oauth-config\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:23.973313 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:23.970126 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-serving-cert\") pod \"console-756b5d68f9-npcpq\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:24.120582 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.120493 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:24.168512 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.168456 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:41:24.190055 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.190018 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.191734 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.190487 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:41:24.197265 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.193339 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2o40h8jb6e7mh\"" Apr 16 17:41:24.197265 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.196699 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 17:41:24.197626 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.197602 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 17:41:24.197882 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.197864 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 17:41:24.199192 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.198071 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 17:41:24.199192 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.198240 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 17:41:24.199192 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.198370 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 17:41:24.199192 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.198618 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-km58q\"" Apr 16 17:41:24.199192 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.198750 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 17:41:24.199192 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.198869 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 17:41:24.199192 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.198997 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 17:41:24.199882 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.199750 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 17:41:24.203273 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.203248 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 17:41:24.204979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.204944 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 17:41:24.251832 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.251789 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.251991 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.251837 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-cfpdn\" (UniqueName: \"kubernetes.io/projected/a3188937-155b-4c43-acda-ef56b6d9499b-kube-api-access-cfpdn\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.251991 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.251870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.251991 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.251900 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a3188937-155b-4c43-acda-ef56b6d9499b-config-out\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.251991 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.251927 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.251991 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.251957 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.251991 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.251984 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252015 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252039 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252064 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-web-config\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252119 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252148 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252192 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252243 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252282 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 
ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252324 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252366 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a3188937-155b-4c43-acda-ef56b6d9499b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.252395 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.252391 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-config\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.342840 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.342800 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756b5d68f9-npcpq"] Apr 16 17:41:24.353294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353266 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353434 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353316 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-web-config\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353434 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353381 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353434 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353410 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353611 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353438 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353611 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353611 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353512 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353611 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353549 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353611 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353590 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a3188937-155b-4c43-acda-ef56b6d9499b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-config\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353862 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:41:24.353697 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfpdn\" (UniqueName: \"kubernetes.io/projected/a3188937-155b-4c43-acda-ef56b6d9499b-kube-api-access-cfpdn\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353755 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a3188937-155b-4c43-acda-ef56b6d9499b-config-out\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353817 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.353862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353845 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.354278 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.353878 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.359840 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.354468 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.359840 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.355471 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.359840 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.356080 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.359840 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.359006 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.363442 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:24.363170 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6139c482_c4ae_4b18_b1d3_8809aca43004.slice/crio-cc2d547a4ae66d867e258cb1c76c4d79847ca744b01209a0cf9b1f8d79f526cc WatchSource:0}: Error finding container cc2d547a4ae66d867e258cb1c76c4d79847ca744b01209a0cf9b1f8d79f526cc: Status 404 returned error can't find the container with id cc2d547a4ae66d867e258cb1c76c4d79847ca744b01209a0cf9b1f8d79f526cc Apr 16 17:41:24.365196 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.365028 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.366859 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.366748 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
17:41:24.369959 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.369936 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.370100 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.370031 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-web-config\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.374348 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.374326 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a3188937-155b-4c43-acda-ef56b6d9499b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.379903 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.379877 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.380003 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.379927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-config\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.380245 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.380212 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a3188937-155b-4c43-acda-ef56b6d9499b-config-out\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.380326 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.380309 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.380401 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.380379 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a3188937-155b-4c43-acda-ef56b6d9499b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.380553 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.380531 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.380751 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.380726 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.380835 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.380789 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a3188937-155b-4c43-acda-ef56b6d9499b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.383560 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.383512 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfpdn\" (UniqueName: \"kubernetes.io/projected/a3188937-155b-4c43-acda-ef56b6d9499b-kube-api-access-cfpdn\") pod \"prometheus-k8s-0\" (UID: \"a3188937-155b-4c43-acda-ef56b6d9499b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.510454 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.509982 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:24.512121 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:24.512080 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756b5d68f9-npcpq" event={"ID":"6139c482-c4ae-4b18-b1d3-8809aca43004","Type":"ContainerStarted","Data":"cc2d547a4ae66d867e258cb1c76c4d79847ca744b01209a0cf9b1f8d79f526cc"} Apr 16 17:41:25.517976 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:25.517936 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756b5d68f9-npcpq" event={"ID":"6139c482-c4ae-4b18-b1d3-8809aca43004","Type":"ContainerStarted","Data":"7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b"} Apr 16 17:41:25.540297 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:25.540240 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-756b5d68f9-npcpq" podStartSLOduration=2.540217612 podStartE2EDuration="2.540217612s" 
podCreationTimestamp="2026-04-16 17:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:25.539626223 +0000 UTC m=+58.913800653" watchObservedRunningTime="2026-04-16 17:41:25.540217612 +0000 UTC m=+58.914392041" Apr 16 17:41:26.259375 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.259324 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:41:26.263453 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:26.263396 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3188937_155b_4c43_acda_ef56b6d9499b.slice/crio-e97d80c863479d376d79e95647239292a98a85aa491864001aea28801a624cd6 WatchSource:0}: Error finding container e97d80c863479d376d79e95647239292a98a85aa491864001aea28801a624cd6: Status 404 returned error can't find the container with id e97d80c863479d376d79e95647239292a98a85aa491864001aea28801a624cd6 Apr 16 17:41:26.524764 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.524736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" event={"ID":"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5","Type":"ContainerStarted","Data":"9459ad0c22bec679296b7e72f02acfaa03820a84a86f89c3ddbda139ef5574c5"} Apr 16 17:41:26.525065 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.524774 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" event={"ID":"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5","Type":"ContainerStarted","Data":"b333c9300e0d0eb4ab387809c9a9d35fee0e8b0a557dca46bd7fb4db3f27ff06"} Apr 16 17:41:26.525065 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.524786 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" 
event={"ID":"9c8951a4-4f69-4f1e-9150-e3ddb02d29b5","Type":"ContainerStarted","Data":"a4df679883b4c82391b5ca66f761fac1bde02a31fa24ded75359c282d8b5981c"} Apr 16 17:41:26.525065 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.524943 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:26.528643 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.528621 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"08803585-a323-4ee6-80e0-b8a63b822ca2","Type":"ContainerStarted","Data":"afc31e909925e41f2737324204ec60baa766a03f50a7845cd633bd9bc7ffc976"} Apr 16 17:41:26.528764 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.528652 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"08803585-a323-4ee6-80e0-b8a63b822ca2","Type":"ContainerStarted","Data":"e93bad906edcc8fa80fa344381d89f2a9559e7b8d13e00ee1c8994cf2cfe6885"} Apr 16 17:41:26.528764 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.528667 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"08803585-a323-4ee6-80e0-b8a63b822ca2","Type":"ContainerStarted","Data":"6c16f69c7e545ff3d976c39193fed5c0e2ac21e4ff655e6ff68cefbacce5c204"} Apr 16 17:41:26.528764 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.528679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"08803585-a323-4ee6-80e0-b8a63b822ca2","Type":"ContainerStarted","Data":"98e12eec8490ff831ebfb23eaf1aa55c1e025aec57f2cdb7a05ff48bfb9f6102"} Apr 16 17:41:26.528764 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.528690 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"08803585-a323-4ee6-80e0-b8a63b822ca2","Type":"ContainerStarted","Data":"c421202af1dba86c919769f7d133c9706a06910371af292a16b839d00e07b235"} Apr 16 17:41:26.530252 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.530175 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" event={"ID":"7436ea53-0555-458d-bb0d-86a73624beff","Type":"ContainerStarted","Data":"30fd58a37758b13299b043eb28503168ce73607af16e4b0ed62e2d1d92845a8b"} Apr 16 17:41:26.530427 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.530409 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" Apr 16 17:41:26.533383 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.533356 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" event={"ID":"55588953-abf7-4d65-9e95-decf43201d1d","Type":"ContainerStarted","Data":"a496ea38afde453678b1f35c05f6c938883131e14eed7f048ec26baba4a16b3f"} Apr 16 17:41:26.535008 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.534985 2580 generic.go:358] "Generic (PLEG): container finished" podID="a3188937-155b-4c43-acda-ef56b6d9499b" containerID="b390b7bf7de0934068ddf16eaad965b7b8ea7a68d6f8b09e62a3ca0a466fe4a4" exitCode=0 Apr 16 17:41:26.535121 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.535037 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a3188937-155b-4c43-acda-ef56b6d9499b","Type":"ContainerDied","Data":"b390b7bf7de0934068ddf16eaad965b7b8ea7a68d6f8b09e62a3ca0a466fe4a4"} Apr 16 17:41:26.535121 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.535062 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a3188937-155b-4c43-acda-ef56b6d9499b","Type":"ContainerStarted","Data":"e97d80c863479d376d79e95647239292a98a85aa491864001aea28801a624cd6"} Apr 16 17:41:26.537120 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.536818 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" Apr 16 17:41:26.555187 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.555077 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" podStartSLOduration=1.842961423 podStartE2EDuration="7.555058496s" podCreationTimestamp="2026-04-16 17:41:19 +0000 UTC" firstStartedPulling="2026-04-16 17:41:20.368356089 +0000 UTC m=+53.742530495" lastFinishedPulling="2026-04-16 17:41:26.080453147 +0000 UTC m=+59.454627568" observedRunningTime="2026-04-16 17:41:26.554176087 +0000 UTC m=+59.928350515" watchObservedRunningTime="2026-04-16 17:41:26.555058496 +0000 UTC m=+59.929232925" Apr 16 17:41:26.574897 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.574850 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" podStartSLOduration=1.617884699 podStartE2EDuration="4.57483383s" podCreationTimestamp="2026-04-16 17:41:22 +0000 UTC" firstStartedPulling="2026-04-16 17:41:23.123425055 +0000 UTC m=+56.497599462" lastFinishedPulling="2026-04-16 17:41:26.080374187 +0000 UTC m=+59.454548593" observedRunningTime="2026-04-16 17:41:26.574585085 +0000 UTC m=+59.948759514" watchObservedRunningTime="2026-04-16 17:41:26.57483383 +0000 UTC m=+59.949008256" Apr 16 17:41:26.638666 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:26.638614 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-2q6xn" podStartSLOduration=1.681689634 podStartE2EDuration="4.638596883s" podCreationTimestamp="2026-04-16 17:41:22 
+0000 UTC" firstStartedPulling="2026-04-16 17:41:23.123425908 +0000 UTC m=+56.497600328" lastFinishedPulling="2026-04-16 17:41:26.080333166 +0000 UTC m=+59.454507577" observedRunningTime="2026-04-16 17:41:26.637592964 +0000 UTC m=+60.011767391" watchObservedRunningTime="2026-04-16 17:41:26.638596883 +0000 UTC m=+60.012771312" Apr 16 17:41:27.108738 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.108705 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ftrk9"] Apr 16 17:41:27.113469 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.113445 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.115494 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.115470 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 17:41:27.115698 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.115475 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 17:41:27.115810 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.115783 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 17:41:27.115888 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.115858 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 17:41:27.115888 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.115786 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kg8l7\"" Apr 16 17:41:27.125370 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.125240 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-ftrk9"] Apr 16 17:41:27.183711 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.183681 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-data-volume\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.183875 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.183796 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-crio-socket\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.183875 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.183826 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.183975 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.183943 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.184023 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.183992 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhd9x\" (UniqueName: \"kubernetes.io/projected/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-kube-api-access-vhd9x\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.284678 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.284633 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.284853 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.284707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhd9x\" (UniqueName: \"kubernetes.io/projected/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-kube-api-access-vhd9x\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.284853 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.284734 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-data-volume\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.284853 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:27.284788 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 17:41:27.285022 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.284860 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-crio-socket\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.285022 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:27.284867 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-insights-runtime-extractor-tls podName:fca7dc6e-d365-471c-9ee4-67a409ac1c9e nodeName:}" failed. No retries permitted until 2026-04-16 17:41:27.784845656 +0000 UTC m=+61.159020062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ftrk9" (UID: "fca7dc6e-d365-471c-9ee4-67a409ac1c9e") : secret "insights-runtime-extractor-tls" not found Apr 16 17:41:27.285022 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.284801 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-crio-socket\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.285022 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.284923 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.285242 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.285194 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-data-volume\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.285554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.285537 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.295236 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.295208 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhd9x\" (UniqueName: \"kubernetes.io/projected/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-kube-api-access-vhd9x\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.398404 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.398316 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5r9pl" Apr 16 17:41:27.542571 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.542520 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"08803585-a323-4ee6-80e0-b8a63b822ca2","Type":"ContainerStarted","Data":"4cf132523ae83790337436c425d2bc0685e20ded9b638ad0a1cb2d95c957d823"} Apr 16 17:41:27.790410 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.790375 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-insights-runtime-extractor-tls\") 
pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:27.793526 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:27.793499 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fca7dc6e-d365-471c-9ee4-67a409ac1c9e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ftrk9\" (UID: \"fca7dc6e-d365-471c-9ee4-67a409ac1c9e\") " pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:28.030433 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:28.030402 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ftrk9" Apr 16 17:41:28.175301 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:28.175232 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.325019816 podStartE2EDuration="10.175210866s" podCreationTimestamp="2026-04-16 17:41:18 +0000 UTC" firstStartedPulling="2026-04-16 17:41:20.235148929 +0000 UTC m=+53.609323338" lastFinishedPulling="2026-04-16 17:41:26.085339971 +0000 UTC m=+59.459514388" observedRunningTime="2026-04-16 17:41:27.5780183 +0000 UTC m=+60.952192727" watchObservedRunningTime="2026-04-16 17:41:28.175210866 +0000 UTC m=+61.549385298" Apr 16 17:41:28.176091 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:28.176032 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ftrk9"] Apr 16 17:41:28.178887 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:28.178853 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca7dc6e_d365_471c_9ee4_67a409ac1c9e.slice/crio-476aeba4f98db202d5809a18c45d652fbb8d480778dcfa6d3e444013d2a29f72 WatchSource:0}: Error 
finding container 476aeba4f98db202d5809a18c45d652fbb8d480778dcfa6d3e444013d2a29f72: Status 404 returned error can't find the container with id 476aeba4f98db202d5809a18c45d652fbb8d480778dcfa6d3e444013d2a29f72 Apr 16 17:41:28.547835 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:28.547793 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ftrk9" event={"ID":"fca7dc6e-d365-471c-9ee4-67a409ac1c9e","Type":"ContainerStarted","Data":"2d110351ff078dd875ea935b9983e03cbc9832be9dc93ad93d9520ec3cd05423"} Apr 16 17:41:28.548289 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:28.547846 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ftrk9" event={"ID":"fca7dc6e-d365-471c-9ee4-67a409ac1c9e","Type":"ContainerStarted","Data":"476aeba4f98db202d5809a18c45d652fbb8d480778dcfa6d3e444013d2a29f72"} Apr 16 17:41:29.557541 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:29.555914 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a3188937-155b-4c43-acda-ef56b6d9499b","Type":"ContainerStarted","Data":"f8d36b7a209e0c60c31e1a7904cd55c73562feb3f2a5b7afa1af56a6a862161d"} Apr 16 17:41:30.562369 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:30.562321 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a3188937-155b-4c43-acda-ef56b6d9499b","Type":"ContainerStarted","Data":"7ef954f75bc812c5f3bc423c9619ee8cea333ac01c55c2cd28c2b71a4f291a4c"} Apr 16 17:41:30.562369 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:30.562361 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a3188937-155b-4c43-acda-ef56b6d9499b","Type":"ContainerStarted","Data":"dd895dbd8446c09467ef9147f1fb1d9c6762661755f5e471f2c96044eccddff3"} Apr 16 17:41:30.562369 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:30.562374 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a3188937-155b-4c43-acda-ef56b6d9499b","Type":"ContainerStarted","Data":"c2680e9c662d8165637c8ccaa0695ccb16a1ae945ef80eab35c59b53aa05b1d4"} Apr 16 17:41:30.562971 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:30.562387 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a3188937-155b-4c43-acda-ef56b6d9499b","Type":"ContainerStarted","Data":"a307b469267fedcab9ee6bd4faedfbf630052bf796da57d31dd7484c28d33d8c"} Apr 16 17:41:30.562971 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:30.562400 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a3188937-155b-4c43-acda-ef56b6d9499b","Type":"ContainerStarted","Data":"a26b9a8c8dbdc372347a8e6fca8624fe63cb7e93a729046a4fb72170540b8149"} Apr 16 17:41:30.564192 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:30.564136 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ftrk9" event={"ID":"fca7dc6e-d365-471c-9ee4-67a409ac1c9e","Type":"ContainerStarted","Data":"2b02ddc413318aa09f6c4d539b36f416c61b01f066d34f11ed30ac83b6ada16a"} Apr 16 17:41:30.598009 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:30.597456 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.805013278 podStartE2EDuration="6.597436302s" podCreationTimestamp="2026-04-16 17:41:24 +0000 UTC" firstStartedPulling="2026-04-16 17:41:26.539068177 +0000 UTC m=+59.913242584" lastFinishedPulling="2026-04-16 17:41:29.331491202 +0000 UTC m=+62.705665608" observedRunningTime="2026-04-16 17:41:30.595461364 +0000 UTC m=+63.969635792" watchObservedRunningTime="2026-04-16 17:41:30.597436302 +0000 UTC m=+63.971610732" Apr 16 17:41:31.568172 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:31.568128 2580 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ftrk9" event={"ID":"fca7dc6e-d365-471c-9ee4-67a409ac1c9e","Type":"ContainerStarted","Data":"636dcf1cb1332a61782adccbab28872d985228575a1443dc3922aa65c517a8ea"} Apr 16 17:41:31.598134 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:31.598080 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ftrk9" podStartSLOduration=1.9107768630000002 podStartE2EDuration="4.598063895s" podCreationTimestamp="2026-04-16 17:41:27 +0000 UTC" firstStartedPulling="2026-04-16 17:41:28.259403159 +0000 UTC m=+61.633577566" lastFinishedPulling="2026-04-16 17:41:30.946690185 +0000 UTC m=+64.320864598" observedRunningTime="2026-04-16 17:41:31.597627951 +0000 UTC m=+64.971802378" watchObservedRunningTime="2026-04-16 17:41:31.598063895 +0000 UTC m=+64.972238323" Apr 16 17:41:32.549485 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:32.549460 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-fd5fbbd54-tdvhw" Apr 16 17:41:33.043624 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.043583 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod \"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:41:33.045908 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.045878 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 17:41:33.057324 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.057255 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75516b17-54a7-403c-b9a7-20ae8a32ebb7-metrics-certs\") pod 
\"network-metrics-daemon-2st9k\" (UID: \"75516b17-54a7-403c-b9a7-20ae8a32ebb7\") " pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:41:33.069434 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.069407 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdxwt\"" Apr 16 17:41:33.077410 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.077375 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2st9k" Apr 16 17:41:33.146094 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.145635 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:41:33.148090 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.147894 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 17:41:33.162598 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.162406 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 17:41:33.169984 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.169960 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwk69\" (UniqueName: \"kubernetes.io/projected/b60522d0-bdd5-4710-961d-66c6a6112265-kube-api-access-hwk69\") pod \"network-check-target-hnxw4\" (UID: \"b60522d0-bdd5-4710-961d-66c6a6112265\") " pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:41:33.221614 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.221576 2580 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2st9k"] Apr 16 17:41:33.224705 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:33.224677 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75516b17_54a7_403c_b9a7_20ae8a32ebb7.slice/crio-0a73f7ce12bae6d8d394543c4b01749c5c6a28bb85860e3681d175d709a267f9 WatchSource:0}: Error finding container 0a73f7ce12bae6d8d394543c4b01749c5c6a28bb85860e3681d175d709a267f9: Status 404 returned error can't find the container with id 0a73f7ce12bae6d8d394543c4b01749c5c6a28bb85860e3681d175d709a267f9 Apr 16 17:41:33.472061 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.472027 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvrx4\"" Apr 16 17:41:33.480189 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.480145 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:41:33.577465 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.577424 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2st9k" event={"ID":"75516b17-54a7-403c-b9a7-20ae8a32ebb7","Type":"ContainerStarted","Data":"0a73f7ce12bae6d8d394543c4b01749c5c6a28bb85860e3681d175d709a267f9"} Apr 16 17:41:33.603051 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:33.602864 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hnxw4"] Apr 16 17:41:33.605579 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:41:33.605554 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb60522d0_bdd5_4710_961d_66c6a6112265.slice/crio-9068555b76a691e524eee5def558d7c53ea137bef71be2914c9bc94df044acb7 WatchSource:0}: Error finding container 
9068555b76a691e524eee5def558d7c53ea137bef71be2914c9bc94df044acb7: Status 404 returned error can't find the container with id 9068555b76a691e524eee5def558d7c53ea137bef71be2914c9bc94df044acb7 Apr 16 17:41:34.121378 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:34.121337 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:34.121848 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:34.121394 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:34.127524 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:34.127498 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:34.510574 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:34.510538 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:34.583843 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:34.583788 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2st9k" event={"ID":"75516b17-54a7-403c-b9a7-20ae8a32ebb7","Type":"ContainerStarted","Data":"1d8f5e6f85c058d6f581dbdb7fc0c608e2a6fbccf8858ff7cf9dc14777316ba2"} Apr 16 17:41:34.585128 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:34.585098 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hnxw4" event={"ID":"b60522d0-bdd5-4710-961d-66c6a6112265","Type":"ContainerStarted","Data":"9068555b76a691e524eee5def558d7c53ea137bef71be2914c9bc94df044acb7"} Apr 16 17:41:34.590570 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:34.590365 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:41:35.591438 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:35.591398 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2st9k" event={"ID":"75516b17-54a7-403c-b9a7-20ae8a32ebb7","Type":"ContainerStarted","Data":"6f73e87f544aa97191b1794f86304f5c0f2fdee7a75e593673cbc75aabe61da5"} Apr 16 17:41:35.611028 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:35.610956 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2st9k" podStartSLOduration=67.518294722 podStartE2EDuration="1m8.610939036s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:41:33.226651144 +0000 UTC m=+66.600825550" lastFinishedPulling="2026-04-16 17:41:34.319295444 +0000 UTC m=+67.693469864" observedRunningTime="2026-04-16 17:41:35.607975375 +0000 UTC m=+68.982149816" watchObservedRunningTime="2026-04-16 17:41:35.610939036 +0000 UTC m=+68.985113512" Apr 16 17:41:36.597171 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:36.597119 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hnxw4" event={"ID":"b60522d0-bdd5-4710-961d-66c6a6112265","Type":"ContainerStarted","Data":"40ceb58e7b8518e849d0433c50d00a58d371a3f024a6311241b82571223a7e05"} Apr 16 17:41:36.597522 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:36.597443 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:41:36.615207 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:36.615138 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hnxw4" podStartSLOduration=66.814273418 podStartE2EDuration="1m9.615124494s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:41:33.607363488 +0000 UTC m=+66.981537894" lastFinishedPulling="2026-04-16 17:41:36.40821456 +0000 UTC m=+69.782388970" 
observedRunningTime="2026-04-16 17:41:36.614468184 +0000 UTC m=+69.988642612" watchObservedRunningTime="2026-04-16 17:41:36.615124494 +0000 UTC m=+69.989298921" Apr 16 17:41:42.656892 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:42.656755 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:42.656892 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:42.656796 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:41:48.780014 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:48.779955 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-548656dbdd-jjqn5" podUID="d1d5cf3a-53b4-4373-b554-3935ed9f47d6" containerName="console" containerID="cri-o://dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48" gracePeriod=15 Apr 16 17:41:49.029513 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.029486 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-548656dbdd-jjqn5_d1d5cf3a-53b4-4373-b554-3935ed9f47d6/console/0.log" Apr 16 17:41:49.029645 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.029546 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:49.183525 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.183494 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdvjx\" (UniqueName: \"kubernetes.io/projected/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-kube-api-access-tdvjx\") pod \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " Apr 16 17:41:49.183677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.183551 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-trusted-ca-bundle\") pod \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " Apr 16 17:41:49.183677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.183572 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-oauth-config\") pod \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " Apr 16 17:41:49.183677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.183597 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-serving-cert\") pod \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " Apr 16 17:41:49.183797 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.183728 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-config\") pod \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " Apr 16 17:41:49.183797 
ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.183781 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-oauth-serving-cert\") pod \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " Apr 16 17:41:49.183877 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.183813 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-service-ca\") pod \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\" (UID: \"d1d5cf3a-53b4-4373-b554-3935ed9f47d6\") " Apr 16 17:41:49.184241 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.184132 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d1d5cf3a-53b4-4373-b554-3935ed9f47d6" (UID: "d1d5cf3a-53b4-4373-b554-3935ed9f47d6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:41:49.184241 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.184220 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-config" (OuterVolumeSpecName: "console-config") pod "d1d5cf3a-53b4-4373-b554-3935ed9f47d6" (UID: "d1d5cf3a-53b4-4373-b554-3935ed9f47d6"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:41:49.184596 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.184250 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d1d5cf3a-53b4-4373-b554-3935ed9f47d6" (UID: "d1d5cf3a-53b4-4373-b554-3935ed9f47d6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:41:49.184596 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.184329 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-service-ca" (OuterVolumeSpecName: "service-ca") pod "d1d5cf3a-53b4-4373-b554-3935ed9f47d6" (UID: "d1d5cf3a-53b4-4373-b554-3935ed9f47d6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:41:49.186059 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.186033 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d1d5cf3a-53b4-4373-b554-3935ed9f47d6" (UID: "d1d5cf3a-53b4-4373-b554-3935ed9f47d6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:41:49.186151 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.186086 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d1d5cf3a-53b4-4373-b554-3935ed9f47d6" (UID: "d1d5cf3a-53b4-4373-b554-3935ed9f47d6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:41:49.186151 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.186102 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-kube-api-access-tdvjx" (OuterVolumeSpecName: "kube-api-access-tdvjx") pod "d1d5cf3a-53b4-4373-b554-3935ed9f47d6" (UID: "d1d5cf3a-53b4-4373-b554-3935ed9f47d6"). InnerVolumeSpecName "kube-api-access-tdvjx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:41:49.285428 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.285393 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdvjx\" (UniqueName: \"kubernetes.io/projected/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-kube-api-access-tdvjx\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:41:49.285428 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.285422 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-trusted-ca-bundle\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:41:49.285428 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.285432 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-oauth-config\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:41:49.285634 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.285440 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-serving-cert\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:41:49.285634 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.285449 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-console-config\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:41:49.285634 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.285458 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-oauth-serving-cert\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:41:49.285634 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.285466 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1d5cf3a-53b4-4373-b554-3935ed9f47d6-service-ca\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:41:49.636072 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.635997 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-548656dbdd-jjqn5_d1d5cf3a-53b4-4373-b554-3935ed9f47d6/console/0.log" Apr 16 17:41:49.636072 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.636035 2580 generic.go:358] "Generic (PLEG): container finished" podID="d1d5cf3a-53b4-4373-b554-3935ed9f47d6" containerID="dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48" exitCode=2 Apr 16 17:41:49.636282 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.636094 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-548656dbdd-jjqn5" event={"ID":"d1d5cf3a-53b4-4373-b554-3935ed9f47d6","Type":"ContainerDied","Data":"dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48"} Apr 16 17:41:49.636282 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.636100 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-548656dbdd-jjqn5" Apr 16 17:41:49.636282 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.636121 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-548656dbdd-jjqn5" event={"ID":"d1d5cf3a-53b4-4373-b554-3935ed9f47d6","Type":"ContainerDied","Data":"030227eecaa8fbce9b65912691579f0b8d51909d32377bf96f79ae681300e995"} Apr 16 17:41:49.636282 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.636136 2580 scope.go:117] "RemoveContainer" containerID="dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48" Apr 16 17:41:49.644811 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.644795 2580 scope.go:117] "RemoveContainer" containerID="dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48" Apr 16 17:41:49.645141 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:41:49.645112 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48\": container with ID starting with dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48 not found: ID does not exist" containerID="dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48" Apr 16 17:41:49.645232 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.645172 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48"} err="failed to get container status \"dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48\": rpc error: code = NotFound desc = could not find container \"dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48\": container with ID starting with dea326683a39b02cac91db4cdaca0ed75f529687d21538bc33be4e5f4f66dc48 not found: ID does not exist" Apr 16 17:41:49.656468 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.656416 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-548656dbdd-jjqn5"] Apr 16 17:41:49.657638 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:49.657621 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-548656dbdd-jjqn5"] Apr 16 17:41:51.263295 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:41:51.263266 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1d5cf3a-53b4-4373-b554-3935ed9f47d6" path="/var/lib/kubelet/pods/d1d5cf3a-53b4-4373-b554-3935ed9f47d6/volumes" Apr 16 17:42:02.661664 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:02.661632 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:42:02.665514 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:02.665493 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-77bdf9f56d-5n2h8" Apr 16 17:42:07.603482 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:07.603444 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hnxw4" Apr 16 17:42:29.652720 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:29.652678 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:29.672249 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:29.672223 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:29.779678 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:29.779382 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:45.850873 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.850838 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2"] Apr 16 17:42:45.851365 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.851283 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1d5cf3a-53b4-4373-b554-3935ed9f47d6" containerName="console" Apr 16 17:42:45.851365 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.851296 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d5cf3a-53b4-4373-b554-3935ed9f47d6" containerName="console" Apr 16 17:42:45.851365 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.851358 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1d5cf3a-53b4-4373-b554-3935ed9f47d6" containerName="console" Apr 16 17:42:45.854429 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.854414 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.859496 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.859461 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 17:42:45.859615 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.859569 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 17:42:45.859702 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.859676 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 17:42:45.859818 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.859807 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-lcsrw\"" Apr 16 17:42:45.859909 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.859895 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 17:42:45.859961 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.859907 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 17:42:45.865890 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.865874 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 17:42:45.870314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.870293 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2"] Apr 16 17:42:45.885020 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.884981 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-metrics-client-ca\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.885201 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.885032 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-federate-client-tls\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.885201 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.885125 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.885201 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.885181 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-serving-certs-ca-bundle\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.885378 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.885214 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-telemeter-client-tls\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.885378 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.885251 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psdr6\" (UniqueName: \"kubernetes.io/projected/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-kube-api-access-psdr6\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.885482 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.885389 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-secret-telemeter-client\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" 
Apr 16 17:42:45.885482 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.885427 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.986008 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.985952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psdr6\" (UniqueName: \"kubernetes.io/projected/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-kube-api-access-psdr6\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.986220 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.986037 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-secret-telemeter-client\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.986220 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.986081 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.986220 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.986113 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-metrics-client-ca\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.986220 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.986141 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-federate-client-tls\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.986220 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.986209 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-telemeter-trusted-ca-bundle\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.986453 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.986317 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-serving-certs-ca-bundle\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.986453 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.986348 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-telemeter-client-tls\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: 
\"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.987199 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.987117 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-serving-certs-ca-bundle\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.987199 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.987143 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-telemeter-trusted-ca-bundle\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.987368 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.987290 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-metrics-client-ca\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.988998 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.988976 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-federate-client-tls\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.989133 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.989039 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-telemeter-client-tls\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.989242 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.989226 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-secret-telemeter-client\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.989300 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.989272 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:45.998446 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:45.998426 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psdr6\" (UniqueName: \"kubernetes.io/projected/08e11f0a-18dd-44a4-ac13-9adb1ead4cfd-kube-api-access-psdr6\") pod \"telemeter-client-55ccfbc5fd-sn2b2\" (UID: \"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd\") " pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:46.164867 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:46.164783 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" Apr 16 17:42:46.294151 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:46.294127 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2"] Apr 16 17:42:46.296209 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:42:46.296183 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e11f0a_18dd_44a4_ac13_9adb1ead4cfd.slice/crio-414d3db262bca92214b21c87188f2e1185ea279f4c3aef8a6508a9dab51bc98b WatchSource:0}: Error finding container 414d3db262bca92214b21c87188f2e1185ea279f4c3aef8a6508a9dab51bc98b: Status 404 returned error can't find the container with id 414d3db262bca92214b21c87188f2e1185ea279f4c3aef8a6508a9dab51bc98b Apr 16 17:42:46.810440 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:46.810401 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" event={"ID":"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd","Type":"ContainerStarted","Data":"414d3db262bca92214b21c87188f2e1185ea279f4c3aef8a6508a9dab51bc98b"} Apr 16 17:42:48.817539 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:48.817506 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" event={"ID":"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd","Type":"ContainerStarted","Data":"ce5fc39cdf6adae4fedafa3f3d01cb455899d981ae3c888023733ec58845cffa"} Apr 16 17:42:48.817539 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:48.817541 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" event={"ID":"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd","Type":"ContainerStarted","Data":"30ec4b25d473f269108f1d971426a80c51b00bb21965fcc171bb3c3eba97aa2c"} Apr 16 17:42:48.817539 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:48.817550 2580 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" event={"ID":"08e11f0a-18dd-44a4-ac13-9adb1ead4cfd","Type":"ContainerStarted","Data":"dd58985e1940ec69a2d6a07df137279c532a4e8271fa0c7d992009a554314984"} Apr 16 17:42:48.845034 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:42:48.844964 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-55ccfbc5fd-sn2b2" podStartSLOduration=2.166462499 podStartE2EDuration="3.844950702s" podCreationTimestamp="2026-04-16 17:42:45 +0000 UTC" firstStartedPulling="2026-04-16 17:42:46.297878388 +0000 UTC m=+139.672052797" lastFinishedPulling="2026-04-16 17:42:47.976366577 +0000 UTC m=+141.350541000" observedRunningTime="2026-04-16 17:42:48.844035671 +0000 UTC m=+142.218210103" watchObservedRunningTime="2026-04-16 17:42:48.844950702 +0000 UTC m=+142.219125130" Apr 16 17:43:08.344264 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:08.344219 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756b5d68f9-npcpq"] Apr 16 17:43:33.369038 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.368981 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-756b5d68f9-npcpq" podUID="6139c482-c4ae-4b18-b1d3-8809aca43004" containerName="console" containerID="cri-o://7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b" gracePeriod=15 Apr 16 17:43:33.610366 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.610342 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756b5d68f9-npcpq_6139c482-c4ae-4b18-b1d3-8809aca43004/console/0.log" Apr 16 17:43:33.610493 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.610407 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:43:33.720244 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720212 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw6tp\" (UniqueName: \"kubernetes.io/projected/6139c482-c4ae-4b18-b1d3-8809aca43004-kube-api-access-tw6tp\") pod \"6139c482-c4ae-4b18-b1d3-8809aca43004\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " Apr 16 17:43:33.720413 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720257 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-oauth-config\") pod \"6139c482-c4ae-4b18-b1d3-8809aca43004\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " Apr 16 17:43:33.720413 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720286 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-console-config\") pod \"6139c482-c4ae-4b18-b1d3-8809aca43004\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " Apr 16 17:43:33.720413 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720320 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-serving-cert\") pod \"6139c482-c4ae-4b18-b1d3-8809aca43004\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " Apr 16 17:43:33.720413 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720356 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-oauth-serving-cert\") pod \"6139c482-c4ae-4b18-b1d3-8809aca43004\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " Apr 16 
17:43:33.720413 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720376 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-trusted-ca-bundle\") pod \"6139c482-c4ae-4b18-b1d3-8809aca43004\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " Apr 16 17:43:33.720413 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720408 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-service-ca\") pod \"6139c482-c4ae-4b18-b1d3-8809aca43004\" (UID: \"6139c482-c4ae-4b18-b1d3-8809aca43004\") " Apr 16 17:43:33.720771 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720746 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-console-config" (OuterVolumeSpecName: "console-config") pod "6139c482-c4ae-4b18-b1d3-8809aca43004" (UID: "6139c482-c4ae-4b18-b1d3-8809aca43004"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:43:33.720907 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720880 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6139c482-c4ae-4b18-b1d3-8809aca43004" (UID: "6139c482-c4ae-4b18-b1d3-8809aca43004"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:43:33.720952 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720897 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6139c482-c4ae-4b18-b1d3-8809aca43004" (UID: "6139c482-c4ae-4b18-b1d3-8809aca43004"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:43:33.720985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.720955 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-service-ca" (OuterVolumeSpecName: "service-ca") pod "6139c482-c4ae-4b18-b1d3-8809aca43004" (UID: "6139c482-c4ae-4b18-b1d3-8809aca43004"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:43:33.722619 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.722594 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6139c482-c4ae-4b18-b1d3-8809aca43004" (UID: "6139c482-c4ae-4b18-b1d3-8809aca43004"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:43:33.723080 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.723057 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6139c482-c4ae-4b18-b1d3-8809aca43004" (UID: "6139c482-c4ae-4b18-b1d3-8809aca43004"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:43:33.723080 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.723066 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6139c482-c4ae-4b18-b1d3-8809aca43004-kube-api-access-tw6tp" (OuterVolumeSpecName: "kube-api-access-tw6tp") pod "6139c482-c4ae-4b18-b1d3-8809aca43004" (UID: "6139c482-c4ae-4b18-b1d3-8809aca43004"). InnerVolumeSpecName "kube-api-access-tw6tp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:43:33.821472 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.821437 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-serving-cert\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:43:33.821472 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.821465 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-oauth-serving-cert\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:43:33.821472 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.821477 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-trusted-ca-bundle\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:43:33.821678 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.821487 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-service-ca\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:43:33.821678 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.821495 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tw6tp\" (UniqueName: 
\"kubernetes.io/projected/6139c482-c4ae-4b18-b1d3-8809aca43004-kube-api-access-tw6tp\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:43:33.821678 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.821503 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6139c482-c4ae-4b18-b1d3-8809aca43004-console-oauth-config\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:43:33.821678 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.821513 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6139c482-c4ae-4b18-b1d3-8809aca43004-console-config\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:43:33.949264 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.949241 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756b5d68f9-npcpq_6139c482-c4ae-4b18-b1d3-8809aca43004/console/0.log" Apr 16 17:43:33.949414 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.949277 2580 generic.go:358] "Generic (PLEG): container finished" podID="6139c482-c4ae-4b18-b1d3-8809aca43004" containerID="7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b" exitCode=2 Apr 16 17:43:33.949414 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.949340 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756b5d68f9-npcpq" event={"ID":"6139c482-c4ae-4b18-b1d3-8809aca43004","Type":"ContainerDied","Data":"7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b"} Apr 16 17:43:33.949414 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.949345 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756b5d68f9-npcpq" Apr 16 17:43:33.949414 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.949366 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756b5d68f9-npcpq" event={"ID":"6139c482-c4ae-4b18-b1d3-8809aca43004","Type":"ContainerDied","Data":"cc2d547a4ae66d867e258cb1c76c4d79847ca744b01209a0cf9b1f8d79f526cc"} Apr 16 17:43:33.949414 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.949383 2580 scope.go:117] "RemoveContainer" containerID="7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b" Apr 16 17:43:33.958044 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.958024 2580 scope.go:117] "RemoveContainer" containerID="7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b" Apr 16 17:43:33.958324 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:43:33.958304 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b\": container with ID starting with 7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b not found: ID does not exist" containerID="7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b" Apr 16 17:43:33.958371 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.958333 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b"} err="failed to get container status \"7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b\": rpc error: code = NotFound desc = could not find container \"7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b\": container with ID starting with 7714192c43a4f9cb5923c3b18e4c337fc71ec1e035c5c7bd20f7949cb660802b not found: ID does not exist" Apr 16 17:43:33.982652 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.982575 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756b5d68f9-npcpq"] Apr 16 17:43:33.989492 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:33.989464 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-756b5d68f9-npcpq"] Apr 16 17:43:35.262659 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:43:35.262620 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6139c482-c4ae-4b18-b1d3-8809aca43004" path="/var/lib/kubelet/pods/6139c482-c4ae-4b18-b1d3-8809aca43004/volumes" Apr 16 17:45:27.125366 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:45:27.125339 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 17:48:35.118339 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.118307 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lr22f"] Apr 16 17:48:35.118831 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.118640 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6139c482-c4ae-4b18-b1d3-8809aca43004" containerName="console" Apr 16 17:48:35.118831 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.118653 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6139c482-c4ae-4b18-b1d3-8809aca43004" containerName="console" Apr 16 17:48:35.118831 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.118703 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="6139c482-c4ae-4b18-b1d3-8809aca43004" containerName="console" Apr 16 17:48:35.121688 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.121672 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.124125 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.124109 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 17:48:35.139908 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.139882 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lr22f"] Apr 16 17:48:35.157751 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.157728 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-kubelet-config\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.157888 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.157762 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-dbus\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.157888 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.157798 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-original-pull-secret\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.258923 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.258887 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-kubelet-config\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.258923 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.258931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-dbus\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.259148 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.258968 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-original-pull-secret\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.259148 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.259019 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-kubelet-config\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.259148 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.259095 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-dbus\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f" Apr 16 17:48:35.261581 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.261550 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68-original-pull-secret\") pod \"global-pull-secret-syncer-lr22f\" (UID: \"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68\") " pod="kube-system/global-pull-secret-syncer-lr22f"
Apr 16 17:48:35.430467 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.430435 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr22f"
Apr 16 17:48:35.552085 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.552059 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lr22f"]
Apr 16 17:48:35.558023 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.558005 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:48:35.811811 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:35.811723 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lr22f" event={"ID":"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68","Type":"ContainerStarted","Data":"d9fb068a863c760437f6d329172204ef1eb59e80e1baf24f0d91e56ea46dc254"}
Apr 16 17:48:39.824443 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:39.824346 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lr22f" event={"ID":"3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68","Type":"ContainerStarted","Data":"9df49dfd1804dd675aec080d07f24f1643459a7f24289da6ce6f4c6229ca02d4"}
Apr 16 17:48:39.843312 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:48:39.843258 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lr22f" podStartSLOduration=0.931417123 podStartE2EDuration="4.843241948s" podCreationTimestamp="2026-04-16 17:48:35 +0000 UTC" firstStartedPulling="2026-04-16 17:48:35.558126014 +0000 UTC m=+488.932300420" lastFinishedPulling="2026-04-16 17:48:39.469950836 +0000 UTC m=+492.844125245" observedRunningTime="2026-04-16 17:48:39.841915268 +0000 UTC m=+493.216089690" watchObservedRunningTime="2026-04-16 17:48:39.843241948 +0000 UTC m=+493.217416413"
Apr 16 17:50:38.876279 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:38.876244 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xdjpj"]
Apr 16 17:50:38.879491 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:38.879475 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:38.882225 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:38.882200 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 17:50:38.882752 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:38.882726 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 17:50:38.883084 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:38.883061 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-8d7rb\""
Apr 16 17:50:38.893001 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:38.892978 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xdjpj"]
Apr 16 17:50:39.034564 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:39.034524 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e85671ed-7943-4570-a6a8-4ac595256ae7-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xdjpj\" (UID: \"e85671ed-7943-4570-a6a8-4ac595256ae7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:39.034564 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:39.034565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vrvv\" (UniqueName: \"kubernetes.io/projected/e85671ed-7943-4570-a6a8-4ac595256ae7-kube-api-access-7vrvv\") pod \"cert-manager-webhook-597b96b99b-xdjpj\" (UID: \"e85671ed-7943-4570-a6a8-4ac595256ae7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:39.135677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:39.135583 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e85671ed-7943-4570-a6a8-4ac595256ae7-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xdjpj\" (UID: \"e85671ed-7943-4570-a6a8-4ac595256ae7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:39.135677 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:39.135618 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vrvv\" (UniqueName: \"kubernetes.io/projected/e85671ed-7943-4570-a6a8-4ac595256ae7-kube-api-access-7vrvv\") pod \"cert-manager-webhook-597b96b99b-xdjpj\" (UID: \"e85671ed-7943-4570-a6a8-4ac595256ae7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:39.149540 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:39.149504 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vrvv\" (UniqueName: \"kubernetes.io/projected/e85671ed-7943-4570-a6a8-4ac595256ae7-kube-api-access-7vrvv\") pod \"cert-manager-webhook-597b96b99b-xdjpj\" (UID: \"e85671ed-7943-4570-a6a8-4ac595256ae7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:39.149924 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:39.149906 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e85671ed-7943-4570-a6a8-4ac595256ae7-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xdjpj\" (UID: \"e85671ed-7943-4570-a6a8-4ac595256ae7\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:39.203958 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:39.203921 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:39.336480 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:39.336433 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xdjpj"]
Apr 16 17:50:39.339419 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:50:39.339392 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85671ed_7943_4570_a6a8_4ac595256ae7.slice/crio-874f35e1b14b037db04941e2e2ed65dd08e9a028ab5bcbd062d9228d79e3e496 WatchSource:0}: Error finding container 874f35e1b14b037db04941e2e2ed65dd08e9a028ab5bcbd062d9228d79e3e496: Status 404 returned error can't find the container with id 874f35e1b14b037db04941e2e2ed65dd08e9a028ab5bcbd062d9228d79e3e496
Apr 16 17:50:40.085412 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.085381 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"]
Apr 16 17:50:40.089863 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.089845 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"
Apr 16 17:50:40.092083 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.092064 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-s67q6\""
Apr 16 17:50:40.099116 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.099094 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"]
Apr 16 17:50:40.181424 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.181394 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj" event={"ID":"e85671ed-7943-4570-a6a8-4ac595256ae7","Type":"ContainerStarted","Data":"874f35e1b14b037db04941e2e2ed65dd08e9a028ab5bcbd062d9228d79e3e496"}
Apr 16 17:50:40.246852 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.246812 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71471c1d-872a-4ac8-82b1-902f586b54d4-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-dg7gt\" (UID: \"71471c1d-872a-4ac8-82b1-902f586b54d4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"
Apr 16 17:50:40.247017 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.246922 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mhm2\" (UniqueName: \"kubernetes.io/projected/71471c1d-872a-4ac8-82b1-902f586b54d4-kube-api-access-5mhm2\") pod \"cert-manager-cainjector-8966b78d4-dg7gt\" (UID: \"71471c1d-872a-4ac8-82b1-902f586b54d4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"
Apr 16 17:50:40.348411 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.348304 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71471c1d-872a-4ac8-82b1-902f586b54d4-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-dg7gt\" (UID: \"71471c1d-872a-4ac8-82b1-902f586b54d4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"
Apr 16 17:50:40.348411 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.348392 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mhm2\" (UniqueName: \"kubernetes.io/projected/71471c1d-872a-4ac8-82b1-902f586b54d4-kube-api-access-5mhm2\") pod \"cert-manager-cainjector-8966b78d4-dg7gt\" (UID: \"71471c1d-872a-4ac8-82b1-902f586b54d4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"
Apr 16 17:50:40.365700 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.365668 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mhm2\" (UniqueName: \"kubernetes.io/projected/71471c1d-872a-4ac8-82b1-902f586b54d4-kube-api-access-5mhm2\") pod \"cert-manager-cainjector-8966b78d4-dg7gt\" (UID: \"71471c1d-872a-4ac8-82b1-902f586b54d4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"
Apr 16 17:50:40.370083 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.370051 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71471c1d-872a-4ac8-82b1-902f586b54d4-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-dg7gt\" (UID: \"71471c1d-872a-4ac8-82b1-902f586b54d4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"
Apr 16 17:50:40.400068 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.400035 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"
Apr 16 17:50:40.576897 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:40.576861 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-dg7gt"]
Apr 16 17:50:41.186273 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:41.186216 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt" event={"ID":"71471c1d-872a-4ac8-82b1-902f586b54d4","Type":"ContainerStarted","Data":"af5fa0032f388c8ef9d5039d74e391c0223536ed50e859499a8df465063e3053"}
Apr 16 17:50:43.194116 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:43.194082 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj" event={"ID":"e85671ed-7943-4570-a6a8-4ac595256ae7","Type":"ContainerStarted","Data":"80103391978a17eaf71ec1b7ad4ffd15d5cd9c9e93340db5d97c8380bc0d1454"}
Apr 16 17:50:43.194561 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:43.194141 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:50:43.195493 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:43.195471 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt" event={"ID":"71471c1d-872a-4ac8-82b1-902f586b54d4","Type":"ContainerStarted","Data":"6086aeb4fba77737992dad22cfde14da4c49580f208ffaaf43824573ec79cbae"}
Apr 16 17:50:43.217128 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:43.217063 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj" podStartSLOduration=1.817777132 podStartE2EDuration="5.21704442s" podCreationTimestamp="2026-04-16 17:50:38 +0000 UTC" firstStartedPulling="2026-04-16 17:50:39.341228576 +0000 UTC m=+612.715402989" lastFinishedPulling="2026-04-16 17:50:42.740495871 +0000 UTC m=+616.114670277" observedRunningTime="2026-04-16 17:50:43.215524463 +0000 UTC m=+616.589698892" watchObservedRunningTime="2026-04-16 17:50:43.21704442 +0000 UTC m=+616.591218850"
Apr 16 17:50:43.235032 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:43.234982 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-dg7gt" podStartSLOduration=1.080579224 podStartE2EDuration="3.234966631s" podCreationTimestamp="2026-04-16 17:50:40 +0000 UTC" firstStartedPulling="2026-04-16 17:50:40.586819511 +0000 UTC m=+613.960993919" lastFinishedPulling="2026-04-16 17:50:42.741206921 +0000 UTC m=+616.115381326" observedRunningTime="2026-04-16 17:50:43.233299346 +0000 UTC m=+616.607473773" watchObservedRunningTime="2026-04-16 17:50:43.234966631 +0000 UTC m=+616.609141038"
Apr 16 17:50:46.089713 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.089679 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-6vtpr"]
Apr 16 17:50:46.093150 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.093131 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-6vtpr"
Apr 16 17:50:46.095402 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.095379 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-6gfkn\""
Apr 16 17:50:46.106151 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.106130 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-6vtpr"]
Apr 16 17:50:46.200905 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.200859 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r64v\" (UniqueName: \"kubernetes.io/projected/5f34cde1-5cbe-4906-865d-c1f1a303e846-kube-api-access-5r64v\") pod \"cert-manager-759f64656b-6vtpr\" (UID: \"5f34cde1-5cbe-4906-865d-c1f1a303e846\") " pod="cert-manager/cert-manager-759f64656b-6vtpr"
Apr 16 17:50:46.200905 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.200910 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f34cde1-5cbe-4906-865d-c1f1a303e846-bound-sa-token\") pod \"cert-manager-759f64656b-6vtpr\" (UID: \"5f34cde1-5cbe-4906-865d-c1f1a303e846\") " pod="cert-manager/cert-manager-759f64656b-6vtpr"
Apr 16 17:50:46.302401 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.302370 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f34cde1-5cbe-4906-865d-c1f1a303e846-bound-sa-token\") pod \"cert-manager-759f64656b-6vtpr\" (UID: \"5f34cde1-5cbe-4906-865d-c1f1a303e846\") " pod="cert-manager/cert-manager-759f64656b-6vtpr"
Apr 16 17:50:46.302551 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.302469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r64v\" (UniqueName: \"kubernetes.io/projected/5f34cde1-5cbe-4906-865d-c1f1a303e846-kube-api-access-5r64v\") pod \"cert-manager-759f64656b-6vtpr\" (UID: \"5f34cde1-5cbe-4906-865d-c1f1a303e846\") " pod="cert-manager/cert-manager-759f64656b-6vtpr"
Apr 16 17:50:46.312630 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.312593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5f34cde1-5cbe-4906-865d-c1f1a303e846-bound-sa-token\") pod \"cert-manager-759f64656b-6vtpr\" (UID: \"5f34cde1-5cbe-4906-865d-c1f1a303e846\") " pod="cert-manager/cert-manager-759f64656b-6vtpr"
Apr 16 17:50:46.312748 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.312646 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r64v\" (UniqueName: \"kubernetes.io/projected/5f34cde1-5cbe-4906-865d-c1f1a303e846-kube-api-access-5r64v\") pod \"cert-manager-759f64656b-6vtpr\" (UID: \"5f34cde1-5cbe-4906-865d-c1f1a303e846\") " pod="cert-manager/cert-manager-759f64656b-6vtpr"
Apr 16 17:50:46.402754 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.402656 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-6vtpr"
Apr 16 17:50:46.526603 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:46.526579 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-6vtpr"]
Apr 16 17:50:46.531809 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:50:46.530834 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f34cde1_5cbe_4906_865d_c1f1a303e846.slice/crio-3208a39aaa79533ef2a730dd59595e414bbe2af22ba2c70256d5c884d3446c53 WatchSource:0}: Error finding container 3208a39aaa79533ef2a730dd59595e414bbe2af22ba2c70256d5c884d3446c53: Status 404 returned error can't find the container with id 3208a39aaa79533ef2a730dd59595e414bbe2af22ba2c70256d5c884d3446c53
Apr 16 17:50:47.208709 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:47.208671 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-6vtpr" event={"ID":"5f34cde1-5cbe-4906-865d-c1f1a303e846","Type":"ContainerStarted","Data":"bb4f51a406068635a237b3600386468f513697ecc34531cb1a0e216ddfecfa2a"}
Apr 16 17:50:47.208709 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:47.208711 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-6vtpr" event={"ID":"5f34cde1-5cbe-4906-865d-c1f1a303e846","Type":"ContainerStarted","Data":"3208a39aaa79533ef2a730dd59595e414bbe2af22ba2c70256d5c884d3446c53"}
Apr 16 17:50:47.232845 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:47.232621 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-6vtpr" podStartSLOduration=1.232603596 podStartE2EDuration="1.232603596s" podCreationTimestamp="2026-04-16 17:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:50:47.231853755 +0000 UTC m=+620.606028182" watchObservedRunningTime="2026-04-16 17:50:47.232603596 +0000 UTC m=+620.606778025"
Apr 16 17:50:49.200676 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:50:49.200643 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-xdjpj"
Apr 16 17:51:21.790492 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.790459 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"]
Apr 16 17:51:21.797339 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.797315 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:21.799862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.799828 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 17:51:21.799862 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.799852 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 17:51:21.800454 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.800436 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 17:51:21.800565 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.800475 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-zm8gf\""
Apr 16 17:51:21.800632 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.800590 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 17:51:21.800738 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.800716 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:51:21.805609 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.805516 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"]
Apr 16 17:51:21.930022 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.929993 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5236521-7026-4b2f-9668-1700617db059-metrics-cert\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:21.930247 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.930039 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl7w5\" (UniqueName: \"kubernetes.io/projected/d5236521-7026-4b2f-9668-1700617db059-kube-api-access-hl7w5\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:21.930247 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.930196 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5236521-7026-4b2f-9668-1700617db059-cert\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:21.930357 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:21.930254 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d5236521-7026-4b2f-9668-1700617db059-manager-config\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.030712 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.030674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5236521-7026-4b2f-9668-1700617db059-metrics-cert\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.030712 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.030718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl7w5\" (UniqueName: \"kubernetes.io/projected/d5236521-7026-4b2f-9668-1700617db059-kube-api-access-hl7w5\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.031020 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.030776 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5236521-7026-4b2f-9668-1700617db059-cert\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.031020 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.030809 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d5236521-7026-4b2f-9668-1700617db059-manager-config\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.031486 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.031469 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d5236521-7026-4b2f-9668-1700617db059-manager-config\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.033458 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.033435 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5236521-7026-4b2f-9668-1700617db059-metrics-cert\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.033554 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.033505 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5236521-7026-4b2f-9668-1700617db059-cert\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.041949 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.041880 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl7w5\" (UniqueName: \"kubernetes.io/projected/d5236521-7026-4b2f-9668-1700617db059-kube-api-access-hl7w5\") pod \"lws-controller-manager-64d875bb5b-pwcvk\" (UID: \"d5236521-7026-4b2f-9668-1700617db059\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.108137 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.108095 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:22.251623 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.251593 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"]
Apr 16 17:51:22.252700 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:51:22.252673 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5236521_7026_4b2f_9668_1700617db059.slice/crio-d4ae168919fc8f18a91f8459062f61acf5464a5284666503dff5ad82c8fa3403 WatchSource:0}: Error finding container d4ae168919fc8f18a91f8459062f61acf5464a5284666503dff5ad82c8fa3403: Status 404 returned error can't find the container with id d4ae168919fc8f18a91f8459062f61acf5464a5284666503dff5ad82c8fa3403
Apr 16 17:51:22.312832 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:22.312749 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk" event={"ID":"d5236521-7026-4b2f-9668-1700617db059","Type":"ContainerStarted","Data":"d4ae168919fc8f18a91f8459062f61acf5464a5284666503dff5ad82c8fa3403"}
Apr 16 17:51:25.324326 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:25.324286 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk" event={"ID":"d5236521-7026-4b2f-9668-1700617db059","Type":"ContainerStarted","Data":"6feae934840bab6a8232c00363bed44aba0acc3ffa12a89fd4c31d9b0728f87c"}
Apr 16 17:51:25.324691 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:25.324412 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:27.124632 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:51:27.124588 2580 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5236521_7026_4b2f_9668_1700617db059.slice/crio-d4ae168919fc8f18a91f8459062f61acf5464a5284666503dff5ad82c8fa3403\": RecentStats: unable to find data in memory cache]"
Apr 16 17:51:36.330859 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:36.330823 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk"
Apr 16 17:51:36.349597 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:51:36.349545 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-pwcvk" podStartSLOduration=12.964462353 podStartE2EDuration="15.349530023s" podCreationTimestamp="2026-04-16 17:51:21 +0000 UTC" firstStartedPulling="2026-04-16 17:51:22.254392102 +0000 UTC m=+655.628566507" lastFinishedPulling="2026-04-16 17:51:24.639459771 +0000 UTC m=+658.013634177" observedRunningTime="2026-04-16 17:51:25.357994079 +0000 UTC m=+658.732168563" watchObservedRunningTime="2026-04-16 17:51:36.349530023 +0000 UTC m=+669.723704450"
Apr 16 17:52:14.560362 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.560289 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"]
Apr 16 17:52:14.562490 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.562474 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"
Apr 16 17:52:14.564560 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.564540 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 17:52:14.564697 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.564666 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-whnw2\""
Apr 16 17:52:14.564828 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.564809 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 17:52:14.565305 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.565291 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 16 17:52:14.577094 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.577069 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"]
Apr 16 17:52:14.688258 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.688219 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf6ld\" (UniqueName: \"kubernetes.io/projected/88629fdb-a326-4062-a0f7-caece9e92898-kube-api-access-nf6ld\") pod \"dns-operator-controller-manager-844548ff4c-m2mv6\" (UID: \"88629fdb-a326-4062-a0f7-caece9e92898\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"
Apr 16 17:52:14.789111 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.789073 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf6ld\" (UniqueName: \"kubernetes.io/projected/88629fdb-a326-4062-a0f7-caece9e92898-kube-api-access-nf6ld\") pod \"dns-operator-controller-manager-844548ff4c-m2mv6\" (UID: \"88629fdb-a326-4062-a0f7-caece9e92898\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"
Apr 16 17:52:14.799260 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.799235 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf6ld\" (UniqueName: \"kubernetes.io/projected/88629fdb-a326-4062-a0f7-caece9e92898-kube-api-access-nf6ld\") pod \"dns-operator-controller-manager-844548ff4c-m2mv6\" (UID: \"88629fdb-a326-4062-a0f7-caece9e92898\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"
Apr 16 17:52:14.873328 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:14.873245 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"
Apr 16 17:52:15.006249 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:15.006212 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"]
Apr 16 17:52:15.010034 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:52:15.009995 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88629fdb_a326_4062_a0f7_caece9e92898.slice/crio-56d5038ee1b9a5d5432e1bcc7986cfa3afa052d17746f8635de221df875a1400 WatchSource:0}: Error finding container 56d5038ee1b9a5d5432e1bcc7986cfa3afa052d17746f8635de221df875a1400: Status 404 returned error can't find the container with id 56d5038ee1b9a5d5432e1bcc7986cfa3afa052d17746f8635de221df875a1400
Apr 16 17:52:15.479987 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:15.479948 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6" event={"ID":"88629fdb-a326-4062-a0f7-caece9e92898","Type":"ContainerStarted","Data":"56d5038ee1b9a5d5432e1bcc7986cfa3afa052d17746f8635de221df875a1400"}
Apr 16 17:52:17.388547 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.388509 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cjqht"]
Apr 16 17:52:17.391165 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.391138 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-cjqht"
Apr 16 17:52:17.394070 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.394037 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-jr8x9\""
Apr 16 17:52:17.403926 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.403881 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cjqht"]
Apr 16 17:52:17.512982 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.512957 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h84x8\" (UniqueName: \"kubernetes.io/projected/56ad4f13-8e31-4272-a2b9-b99b022f311a-kube-api-access-h84x8\") pod \"authorino-operator-7587b89b76-cjqht\" (UID: \"56ad4f13-8e31-4272-a2b9-b99b022f311a\") " pod="kuadrant-system/authorino-operator-7587b89b76-cjqht"
Apr 16 17:52:17.614581 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.614553 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h84x8\" (UniqueName: \"kubernetes.io/projected/56ad4f13-8e31-4272-a2b9-b99b022f311a-kube-api-access-h84x8\") pod \"authorino-operator-7587b89b76-cjqht\" (UID: \"56ad4f13-8e31-4272-a2b9-b99b022f311a\") " pod="kuadrant-system/authorino-operator-7587b89b76-cjqht"
Apr 16 17:52:17.631219 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.631187 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h84x8\" (UniqueName: \"kubernetes.io/projected/56ad4f13-8e31-4272-a2b9-b99b022f311a-kube-api-access-h84x8\") pod \"authorino-operator-7587b89b76-cjqht\" (UID: \"56ad4f13-8e31-4272-a2b9-b99b022f311a\") " pod="kuadrant-system/authorino-operator-7587b89b76-cjqht"
Apr 16 17:52:17.705987 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.705950 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-cjqht"
Apr 16 17:52:17.840078 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:17.840047 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-cjqht"]
Apr 16 17:52:17.842476 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:52:17.842435 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ad4f13_8e31_4272_a2b9_b99b022f311a.slice/crio-06b20e90c1b43d7a24f5ab3f9c952b4d47c811660ef1a927ba00861bd6e1886f WatchSource:0}: Error finding container 06b20e90c1b43d7a24f5ab3f9c952b4d47c811660ef1a927ba00861bd6e1886f: Status 404 returned error can't find the container with id 06b20e90c1b43d7a24f5ab3f9c952b4d47c811660ef1a927ba00861bd6e1886f
Apr 16 17:52:18.492558 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:18.492517 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6" event={"ID":"88629fdb-a326-4062-a0f7-caece9e92898","Type":"ContainerStarted","Data":"38d614d7a9cdcd53f715fd2d54f4b19bcf44b43ecaf602858f45e411d43eaf59"}
Apr 16 17:52:18.493049 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:18.492590 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6"
Apr 16 17:52:18.493928 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:18.493905 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-cjqht" event={"ID":"56ad4f13-8e31-4272-a2b9-b99b022f311a","Type":"ContainerStarted","Data":"06b20e90c1b43d7a24f5ab3f9c952b4d47c811660ef1a927ba00861bd6e1886f"}
Apr 16 17:52:18.527363 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:18.527302 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6" podStartSLOduration=2.075263345 podStartE2EDuration="4.527282906s" podCreationTimestamp="2026-04-16 17:52:14 +0000 UTC" firstStartedPulling="2026-04-16 17:52:15.012823598 +0000 UTC m=+708.386998012" lastFinishedPulling="2026-04-16 17:52:17.464843168 +0000 UTC m=+710.839017573" observedRunningTime="2026-04-16 17:52:18.52486413 +0000 UTC m=+711.899038558" watchObservedRunningTime="2026-04-16 17:52:18.527282906 +0000 UTC m=+711.901457336"
Apr 16 17:52:19.499150 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:19.499058 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-cjqht" event={"ID":"56ad4f13-8e31-4272-a2b9-b99b022f311a","Type":"ContainerStarted","Data":"40462cc120c341e0977629c2206d3f5c3211792b6fe1fddd1fda255553050da7"}
Apr 16 17:52:19.499548 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:19.499350 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-cjqht"
Apr 16 17:52:19.525250 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:19.525194 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-cjqht" podStartSLOduration=1.134490568 podStartE2EDuration="2.525172238s" podCreationTimestamp="2026-04-16 17:52:17 +0000 UTC" firstStartedPulling="2026-04-16 17:52:17.844761972 +0000 UTC m=+711.218936378" lastFinishedPulling="2026-04-16 17:52:19.235443639 +0000 UTC m=+712.609618048" observedRunningTime="2026-04-16 17:52:19.52353001 +0000 UTC m=+712.897704440" watchObservedRunningTime="2026-04-16
17:52:19.525172238 +0000 UTC m=+712.899346658" Apr 16 17:52:29.501669 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:29.501630 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-m2mv6" Apr 16 17:52:30.505100 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:52:30.505071 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-cjqht" Apr 16 17:53:09.596989 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.596953 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-s69lj"] Apr 16 17:53:09.600484 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.600466 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:09.602569 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.602555 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-v8xg6\"" Apr 16 17:53:09.602633 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.602582 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 17:53:09.611384 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.611362 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-s69lj"] Apr 16 17:53:09.694603 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.694568 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-s69lj"] Apr 16 17:53:09.771258 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.771225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6xgj\" (UniqueName: 
\"kubernetes.io/projected/e8a69617-15eb-43df-b914-6c3d3e59ae1e-kube-api-access-f6xgj\") pod \"limitador-limitador-64c8f475fb-s69lj\" (UID: \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:09.771420 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.771305 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e8a69617-15eb-43df-b914-6c3d3e59ae1e-config-file\") pod \"limitador-limitador-64c8f475fb-s69lj\" (UID: \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:09.872188 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.872059 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6xgj\" (UniqueName: \"kubernetes.io/projected/e8a69617-15eb-43df-b914-6c3d3e59ae1e-kube-api-access-f6xgj\") pod \"limitador-limitador-64c8f475fb-s69lj\" (UID: \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:09.872188 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.872127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e8a69617-15eb-43df-b914-6c3d3e59ae1e-config-file\") pod \"limitador-limitador-64c8f475fb-s69lj\" (UID: \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:09.872785 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.872758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e8a69617-15eb-43df-b914-6c3d3e59ae1e-config-file\") pod \"limitador-limitador-64c8f475fb-s69lj\" (UID: \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:09.882058 
ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.882027 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6xgj\" (UniqueName: \"kubernetes.io/projected/e8a69617-15eb-43df-b914-6c3d3e59ae1e-kube-api-access-f6xgj\") pod \"limitador-limitador-64c8f475fb-s69lj\" (UID: \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:09.910936 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:09.910900 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:10.041796 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.041727 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-s69lj"] Apr 16 17:53:10.044377 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:53:10.044352 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a69617_15eb_43df_b914_6c3d3e59ae1e.slice/crio-b5e1d433907b3bce24a85e0efd2e35094a38c81efcdd9a927bcd75ebefd5d991 WatchSource:0}: Error finding container b5e1d433907b3bce24a85e0efd2e35094a38c81efcdd9a927bcd75ebefd5d991: Status 404 returned error can't find the container with id b5e1d433907b3bce24a85e0efd2e35094a38c81efcdd9a927bcd75ebefd5d991 Apr 16 17:53:10.481309 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.481271 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-fz2rj"] Apr 16 17:53:10.486900 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.486871 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-fz2rj" Apr 16 17:53:10.489857 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.489836 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-j2xjx\"" Apr 16 17:53:10.492437 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.492410 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-fz2rj"] Apr 16 17:53:10.579804 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.579762 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5czhq\" (UniqueName: \"kubernetes.io/projected/a97cfc21-8a57-4506-b86a-780c6f5869db-kube-api-access-5czhq\") pod \"authorino-674b59b84c-fz2rj\" (UID: \"a97cfc21-8a57-4506-b86a-780c6f5869db\") " pod="kuadrant-system/authorino-674b59b84c-fz2rj" Apr 16 17:53:10.681331 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.681299 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5czhq\" (UniqueName: \"kubernetes.io/projected/a97cfc21-8a57-4506-b86a-780c6f5869db-kube-api-access-5czhq\") pod \"authorino-674b59b84c-fz2rj\" (UID: \"a97cfc21-8a57-4506-b86a-780c6f5869db\") " pod="kuadrant-system/authorino-674b59b84c-fz2rj" Apr 16 17:53:10.686610 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.686574 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" event={"ID":"e8a69617-15eb-43df-b914-6c3d3e59ae1e","Type":"ContainerStarted","Data":"b5e1d433907b3bce24a85e0efd2e35094a38c81efcdd9a927bcd75ebefd5d991"} Apr 16 17:53:10.697621 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.697592 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5czhq\" (UniqueName: \"kubernetes.io/projected/a97cfc21-8a57-4506-b86a-780c6f5869db-kube-api-access-5czhq\") pod 
\"authorino-674b59b84c-fz2rj\" (UID: \"a97cfc21-8a57-4506-b86a-780c6f5869db\") " pod="kuadrant-system/authorino-674b59b84c-fz2rj" Apr 16 17:53:10.798736 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.798660 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-fz2rj" Apr 16 17:53:10.933824 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:10.933091 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-fz2rj"] Apr 16 17:53:10.937151 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:53:10.937062 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97cfc21_8a57_4506_b86a_780c6f5869db.slice/crio-6ffe9240ea878c7d596d6744a8e98bb59fa039c953c3623ea24f97868488ae41 WatchSource:0}: Error finding container 6ffe9240ea878c7d596d6744a8e98bb59fa039c953c3623ea24f97868488ae41: Status 404 returned error can't find the container with id 6ffe9240ea878c7d596d6744a8e98bb59fa039c953c3623ea24f97868488ae41 Apr 16 17:53:11.693037 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:11.693002 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-fz2rj" event={"ID":"a97cfc21-8a57-4506-b86a-780c6f5869db","Type":"ContainerStarted","Data":"6ffe9240ea878c7d596d6744a8e98bb59fa039c953c3623ea24f97868488ae41"} Apr 16 17:53:14.256225 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:14.256184 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-fz2rj"] Apr 16 17:53:15.714985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:15.714946 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" event={"ID":"e8a69617-15eb-43df-b914-6c3d3e59ae1e","Type":"ContainerStarted","Data":"0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852"} Apr 16 17:53:15.715446 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:53:15.715078 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:15.716253 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:15.716233 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-fz2rj" event={"ID":"a97cfc21-8a57-4506-b86a-780c6f5869db","Type":"ContainerStarted","Data":"bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012"} Apr 16 17:53:15.716356 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:15.716273 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-fz2rj" podUID="a97cfc21-8a57-4506-b86a-780c6f5869db" containerName="authorino" containerID="cri-o://bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012" gracePeriod=30 Apr 16 17:53:15.733881 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:15.733824 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" podStartSLOduration=1.51457423 podStartE2EDuration="6.733804254s" podCreationTimestamp="2026-04-16 17:53:09 +0000 UTC" firstStartedPulling="2026-04-16 17:53:10.046664251 +0000 UTC m=+763.420838665" lastFinishedPulling="2026-04-16 17:53:15.265894283 +0000 UTC m=+768.640068689" observedRunningTime="2026-04-16 17:53:15.733033983 +0000 UTC m=+769.107208404" watchObservedRunningTime="2026-04-16 17:53:15.733804254 +0000 UTC m=+769.107978689" Apr 16 17:53:15.954225 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:15.954205 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-fz2rj" Apr 16 17:53:16.135924 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.135826 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5czhq\" (UniqueName: \"kubernetes.io/projected/a97cfc21-8a57-4506-b86a-780c6f5869db-kube-api-access-5czhq\") pod \"a97cfc21-8a57-4506-b86a-780c6f5869db\" (UID: \"a97cfc21-8a57-4506-b86a-780c6f5869db\") " Apr 16 17:53:16.138183 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.138140 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97cfc21-8a57-4506-b86a-780c6f5869db-kube-api-access-5czhq" (OuterVolumeSpecName: "kube-api-access-5czhq") pod "a97cfc21-8a57-4506-b86a-780c6f5869db" (UID: "a97cfc21-8a57-4506-b86a-780c6f5869db"). InnerVolumeSpecName "kube-api-access-5czhq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:53:16.237327 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.237285 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5czhq\" (UniqueName: \"kubernetes.io/projected/a97cfc21-8a57-4506-b86a-780c6f5869db-kube-api-access-5czhq\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:53:16.720991 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.720954 2580 generic.go:358] "Generic (PLEG): container finished" podID="a97cfc21-8a57-4506-b86a-780c6f5869db" containerID="bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012" exitCode=0 Apr 16 17:53:16.721421 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.721001 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-fz2rj" Apr 16 17:53:16.721421 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.721037 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-fz2rj" event={"ID":"a97cfc21-8a57-4506-b86a-780c6f5869db","Type":"ContainerDied","Data":"bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012"} Apr 16 17:53:16.721421 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.721066 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-fz2rj" event={"ID":"a97cfc21-8a57-4506-b86a-780c6f5869db","Type":"ContainerDied","Data":"6ffe9240ea878c7d596d6744a8e98bb59fa039c953c3623ea24f97868488ae41"} Apr 16 17:53:16.721421 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.721080 2580 scope.go:117] "RemoveContainer" containerID="bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012" Apr 16 17:53:16.729299 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.729286 2580 scope.go:117] "RemoveContainer" containerID="bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012" Apr 16 17:53:16.729528 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:53:16.729511 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012\": container with ID starting with bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012 not found: ID does not exist" containerID="bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012" Apr 16 17:53:16.729571 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.729537 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012"} err="failed to get container status \"bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012\": rpc error: code = 
NotFound desc = could not find container \"bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012\": container with ID starting with bb314754b67c0aa14bd406c5403d506d0adea0cabd7f9ad7ff7a9bc1828f3012 not found: ID does not exist" Apr 16 17:53:16.742867 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.742849 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-fz2rj"] Apr 16 17:53:16.747437 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:16.747417 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-fz2rj"] Apr 16 17:53:17.263934 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:17.263904 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a97cfc21-8a57-4506-b86a-780c6f5869db" path="/var/lib/kubelet/pods/a97cfc21-8a57-4506-b86a-780c6f5869db/volumes" Apr 16 17:53:22.989185 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:22.989121 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-s69lj"] Apr 16 17:53:22.989657 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:22.989452 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" podUID="e8a69617-15eb-43df-b914-6c3d3e59ae1e" containerName="limitador" containerID="cri-o://0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852" gracePeriod=30 Apr 16 17:53:22.991685 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:22.991663 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:23.525479 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.525459 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:23.598985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.598894 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6xgj\" (UniqueName: \"kubernetes.io/projected/e8a69617-15eb-43df-b914-6c3d3e59ae1e-kube-api-access-f6xgj\") pod \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\" (UID: \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\") " Apr 16 17:53:23.598985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.598957 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e8a69617-15eb-43df-b914-6c3d3e59ae1e-config-file\") pod \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\" (UID: \"e8a69617-15eb-43df-b914-6c3d3e59ae1e\") " Apr 16 17:53:23.599392 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.599370 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a69617-15eb-43df-b914-6c3d3e59ae1e-config-file" (OuterVolumeSpecName: "config-file") pod "e8a69617-15eb-43df-b914-6c3d3e59ae1e" (UID: "e8a69617-15eb-43df-b914-6c3d3e59ae1e"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:53:23.601203 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.601181 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a69617-15eb-43df-b914-6c3d3e59ae1e-kube-api-access-f6xgj" (OuterVolumeSpecName: "kube-api-access-f6xgj") pod "e8a69617-15eb-43df-b914-6c3d3e59ae1e" (UID: "e8a69617-15eb-43df-b914-6c3d3e59ae1e"). InnerVolumeSpecName "kube-api-access-f6xgj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:53:23.699696 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.699657 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6xgj\" (UniqueName: \"kubernetes.io/projected/e8a69617-15eb-43df-b914-6c3d3e59ae1e-kube-api-access-f6xgj\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:53:23.699696 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.699688 2580 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e8a69617-15eb-43df-b914-6c3d3e59ae1e-config-file\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:53:23.745806 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.745774 2580 generic.go:358] "Generic (PLEG): container finished" podID="e8a69617-15eb-43df-b914-6c3d3e59ae1e" containerID="0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852" exitCode=0 Apr 16 17:53:23.745954 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.745839 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" event={"ID":"e8a69617-15eb-43df-b914-6c3d3e59ae1e","Type":"ContainerDied","Data":"0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852"} Apr 16 17:53:23.745954 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.745846 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" Apr 16 17:53:23.745954 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.745873 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-s69lj" event={"ID":"e8a69617-15eb-43df-b914-6c3d3e59ae1e","Type":"ContainerDied","Data":"b5e1d433907b3bce24a85e0efd2e35094a38c81efcdd9a927bcd75ebefd5d991"} Apr 16 17:53:23.745954 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.745893 2580 scope.go:117] "RemoveContainer" containerID="0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852" Apr 16 17:53:23.754382 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.754367 2580 scope.go:117] "RemoveContainer" containerID="0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852" Apr 16 17:53:23.754635 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:53:23.754620 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852\": container with ID starting with 0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852 not found: ID does not exist" containerID="0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852" Apr 16 17:53:23.754669 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.754643 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852"} err="failed to get container status \"0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852\": rpc error: code = NotFound desc = could not find container \"0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852\": container with ID starting with 0b23dd58083999817e435621db0e78b75cacc50c75f7bad61a846cb2cc72f852 not found: ID does not exist" Apr 16 17:53:23.773340 ip-10-0-143-216 kubenswrapper[2580]: I0416 
17:53:23.773311 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-s69lj"] Apr 16 17:53:23.778108 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:23.778084 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-s69lj"] Apr 16 17:53:25.263262 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:25.263225 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a69617-15eb-43df-b914-6c3d3e59ae1e" path="/var/lib/kubelet/pods/e8a69617-15eb-43df-b914-6c3d3e59ae1e/volumes" Apr 16 17:53:32.677334 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.677296 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-cjfrq"] Apr 16 17:53:32.677708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.677684 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a97cfc21-8a57-4506-b86a-780c6f5869db" containerName="authorino" Apr 16 17:53:32.677708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.677696 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97cfc21-8a57-4506-b86a-780c6f5869db" containerName="authorino" Apr 16 17:53:32.677775 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.677717 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8a69617-15eb-43df-b914-6c3d3e59ae1e" containerName="limitador" Apr 16 17:53:32.677775 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.677723 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a69617-15eb-43df-b914-6c3d3e59ae1e" containerName="limitador" Apr 16 17:53:32.677834 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.677779 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8a69617-15eb-43df-b914-6c3d3e59ae1e" containerName="limitador" Apr 16 17:53:32.677834 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.677792 2580 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="a97cfc21-8a57-4506-b86a-780c6f5869db" containerName="authorino" Apr 16 17:53:32.682599 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.682579 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-cjfrq" Apr 16 17:53:32.685415 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.685395 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-j2xjx\"" Apr 16 17:53:32.685542 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.685454 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 17:53:32.692950 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.692921 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-cjfrq"] Apr 16 17:53:32.777522 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.777493 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/34906abf-3955-483a-a0e0-0732c2dd0a23-tls-cert\") pod \"authorino-68bd676465-cjfrq\" (UID: \"34906abf-3955-483a-a0e0-0732c2dd0a23\") " pod="kuadrant-system/authorino-68bd676465-cjfrq" Apr 16 17:53:32.777732 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.777569 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxlvg\" (UniqueName: \"kubernetes.io/projected/34906abf-3955-483a-a0e0-0732c2dd0a23-kube-api-access-kxlvg\") pod \"authorino-68bd676465-cjfrq\" (UID: \"34906abf-3955-483a-a0e0-0732c2dd0a23\") " pod="kuadrant-system/authorino-68bd676465-cjfrq" Apr 16 17:53:32.878577 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.878535 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/34906abf-3955-483a-a0e0-0732c2dd0a23-tls-cert\") pod \"authorino-68bd676465-cjfrq\" (UID: \"34906abf-3955-483a-a0e0-0732c2dd0a23\") " pod="kuadrant-system/authorino-68bd676465-cjfrq" Apr 16 17:53:32.878770 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.878589 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxlvg\" (UniqueName: \"kubernetes.io/projected/34906abf-3955-483a-a0e0-0732c2dd0a23-kube-api-access-kxlvg\") pod \"authorino-68bd676465-cjfrq\" (UID: \"34906abf-3955-483a-a0e0-0732c2dd0a23\") " pod="kuadrant-system/authorino-68bd676465-cjfrq" Apr 16 17:53:32.881172 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.881134 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/34906abf-3955-483a-a0e0-0732c2dd0a23-tls-cert\") pod \"authorino-68bd676465-cjfrq\" (UID: \"34906abf-3955-483a-a0e0-0732c2dd0a23\") " pod="kuadrant-system/authorino-68bd676465-cjfrq" Apr 16 17:53:32.891681 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.891659 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxlvg\" (UniqueName: \"kubernetes.io/projected/34906abf-3955-483a-a0e0-0732c2dd0a23-kube-api-access-kxlvg\") pod \"authorino-68bd676465-cjfrq\" (UID: \"34906abf-3955-483a-a0e0-0732c2dd0a23\") " pod="kuadrant-system/authorino-68bd676465-cjfrq" Apr 16 17:53:32.991839 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:32.991757 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-cjfrq" Apr 16 17:53:33.118632 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:33.118561 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-cjfrq"] Apr 16 17:53:33.120917 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:53:33.120882 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34906abf_3955_483a_a0e0_0732c2dd0a23.slice/crio-cc9e4f156e38502e4c86c99079731761da6d60f20c05122ee0f5c6cf0029cbaa WatchSource:0}: Error finding container cc9e4f156e38502e4c86c99079731761da6d60f20c05122ee0f5c6cf0029cbaa: Status 404 returned error can't find the container with id cc9e4f156e38502e4c86c99079731761da6d60f20c05122ee0f5c6cf0029cbaa Apr 16 17:53:33.785054 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:33.785025 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-cjfrq" event={"ID":"34906abf-3955-483a-a0e0-0732c2dd0a23","Type":"ContainerStarted","Data":"cc9e4f156e38502e4c86c99079731761da6d60f20c05122ee0f5c6cf0029cbaa"} Apr 16 17:53:34.789616 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:34.789584 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-cjfrq" event={"ID":"34906abf-3955-483a-a0e0-0732c2dd0a23","Type":"ContainerStarted","Data":"de1851f05ec6dad1739de64b9d025b3235364b176be41adb7b92923bce26f6c1"} Apr 16 17:53:34.812363 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:34.812277 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-cjfrq" podStartSLOduration=2.187730047 podStartE2EDuration="2.812258639s" podCreationTimestamp="2026-04-16 17:53:32 +0000 UTC" firstStartedPulling="2026-04-16 17:53:33.122130095 +0000 UTC m=+786.496304505" lastFinishedPulling="2026-04-16 17:53:33.746658688 +0000 UTC m=+787.120833097" 
observedRunningTime="2026-04-16 17:53:34.812007925 +0000 UTC m=+788.186182352" watchObservedRunningTime="2026-04-16 17:53:34.812258639 +0000 UTC m=+788.186433048" Apr 16 17:53:53.468513 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.468434 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-f7wss"] Apr 16 17:53:53.472114 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.472094 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:53.474633 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.474613 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 17:53:53.475317 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.475298 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 17:53:53.475394 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.475315 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 17:53:53.475394 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.475298 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-w8v4r\"" Apr 16 17:53:53.482759 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.482736 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-f7wss"] Apr 16 17:53:53.529335 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.529306 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-n8vbf"] Apr 16 17:53:53.533594 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.533570 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:53.537419 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.537398 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 17:53:53.537542 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.537510 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6w4xh\"" Apr 16 17:53:53.544634 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.544615 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-n8vbf"] Apr 16 17:53:53.572485 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.572456 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1bd190d-225f-47dd-a61d-73c6352f7eb8-cert\") pod \"kserve-controller-manager-7f8f4564d-f7wss\" (UID: \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\") " pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:53.572636 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.572505 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwvms\" (UniqueName: \"kubernetes.io/projected/f1bd190d-225f-47dd-a61d-73c6352f7eb8-kube-api-access-xwvms\") pod \"kserve-controller-manager-7f8f4564d-f7wss\" (UID: \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\") " pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:53.673528 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.673497 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b7450c29-7c4d-4529-aa0e-811267d2a02c-data\") pod \"seaweedfs-86cc847c5c-n8vbf\" (UID: \"b7450c29-7c4d-4529-aa0e-811267d2a02c\") " pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:53.673696 ip-10-0-143-216 kubenswrapper[2580]: I0416 
17:53:53.673630 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx4qq\" (UniqueName: \"kubernetes.io/projected/b7450c29-7c4d-4529-aa0e-811267d2a02c-kube-api-access-wx4qq\") pod \"seaweedfs-86cc847c5c-n8vbf\" (UID: \"b7450c29-7c4d-4529-aa0e-811267d2a02c\") " pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:53.673696 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.673668 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1bd190d-225f-47dd-a61d-73c6352f7eb8-cert\") pod \"kserve-controller-manager-7f8f4564d-f7wss\" (UID: \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\") " pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:53.673771 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.673701 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwvms\" (UniqueName: \"kubernetes.io/projected/f1bd190d-225f-47dd-a61d-73c6352f7eb8-kube-api-access-xwvms\") pod \"kserve-controller-manager-7f8f4564d-f7wss\" (UID: \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\") " pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:53.676274 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.676256 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1bd190d-225f-47dd-a61d-73c6352f7eb8-cert\") pod \"kserve-controller-manager-7f8f4564d-f7wss\" (UID: \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\") " pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:53.683851 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.683829 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwvms\" (UniqueName: \"kubernetes.io/projected/f1bd190d-225f-47dd-a61d-73c6352f7eb8-kube-api-access-xwvms\") pod \"kserve-controller-manager-7f8f4564d-f7wss\" (UID: 
\"f1bd190d-225f-47dd-a61d-73c6352f7eb8\") " pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:53.775327 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.775245 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx4qq\" (UniqueName: \"kubernetes.io/projected/b7450c29-7c4d-4529-aa0e-811267d2a02c-kube-api-access-wx4qq\") pod \"seaweedfs-86cc847c5c-n8vbf\" (UID: \"b7450c29-7c4d-4529-aa0e-811267d2a02c\") " pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:53.775327 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.775310 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b7450c29-7c4d-4529-aa0e-811267d2a02c-data\") pod \"seaweedfs-86cc847c5c-n8vbf\" (UID: \"b7450c29-7c4d-4529-aa0e-811267d2a02c\") " pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:53.775745 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.775725 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b7450c29-7c4d-4529-aa0e-811267d2a02c-data\") pod \"seaweedfs-86cc847c5c-n8vbf\" (UID: \"b7450c29-7c4d-4529-aa0e-811267d2a02c\") " pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:53.783103 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.783077 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:53.785326 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.785304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx4qq\" (UniqueName: \"kubernetes.io/projected/b7450c29-7c4d-4529-aa0e-811267d2a02c-kube-api-access-wx4qq\") pod \"seaweedfs-86cc847c5c-n8vbf\" (UID: \"b7450c29-7c4d-4529-aa0e-811267d2a02c\") " pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:53.844415 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.844380 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:53.916092 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.916064 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-f7wss"] Apr 16 17:53:53.918291 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:53:53.918264 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1bd190d_225f_47dd_a61d_73c6352f7eb8.slice/crio-0d10495f5fe50ccc0b52083283d936eebca1318c7e0a62c1a55a1acd7133dd88 WatchSource:0}: Error finding container 0d10495f5fe50ccc0b52083283d936eebca1318c7e0a62c1a55a1acd7133dd88: Status 404 returned error can't find the container with id 0d10495f5fe50ccc0b52083283d936eebca1318c7e0a62c1a55a1acd7133dd88 Apr 16 17:53:53.920183 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.920147 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:53:53.982849 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:53.982827 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-n8vbf"] Apr 16 17:53:53.984965 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:53:53.984935 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7450c29_7c4d_4529_aa0e_811267d2a02c.slice/crio-80b673b076308a0aab6db29faf09593d1886732080599409be579a4cbc92c17c WatchSource:0}: Error finding container 80b673b076308a0aab6db29faf09593d1886732080599409be579a4cbc92c17c: Status 404 returned error can't find the container with id 80b673b076308a0aab6db29faf09593d1886732080599409be579a4cbc92c17c Apr 16 17:53:54.862001 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:54.861925 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" event={"ID":"f1bd190d-225f-47dd-a61d-73c6352f7eb8","Type":"ContainerStarted","Data":"0d10495f5fe50ccc0b52083283d936eebca1318c7e0a62c1a55a1acd7133dd88"} Apr 16 17:53:54.864123 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:54.864069 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-n8vbf" event={"ID":"b7450c29-7c4d-4529-aa0e-811267d2a02c","Type":"ContainerStarted","Data":"80b673b076308a0aab6db29faf09593d1886732080599409be579a4cbc92c17c"} Apr 16 17:53:57.876954 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:57.876924 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-n8vbf" event={"ID":"b7450c29-7c4d-4529-aa0e-811267d2a02c","Type":"ContainerStarted","Data":"f275a0beffe91999a4a641335d00a8c9ed647012e15478aa2857555bec9b242b"} Apr 16 17:53:57.877442 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:57.876988 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:53:57.878349 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:57.878325 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" event={"ID":"f1bd190d-225f-47dd-a61d-73c6352f7eb8","Type":"ContainerStarted","Data":"1daddb9a27057108312e77bd3279f14e4f81f64071f14cae15a975f11007928a"} Apr 16 17:53:57.878465 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:53:57.878439 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:53:57.897074 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:57.897022 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-n8vbf" podStartSLOduration=1.186394096 podStartE2EDuration="4.897005857s" podCreationTimestamp="2026-04-16 17:53:53 +0000 UTC" firstStartedPulling="2026-04-16 17:53:53.986306998 +0000 UTC m=+807.360481408" lastFinishedPulling="2026-04-16 17:53:57.696918749 +0000 UTC m=+811.071093169" observedRunningTime="2026-04-16 17:53:57.896110431 +0000 UTC m=+811.270284850" watchObservedRunningTime="2026-04-16 17:53:57.897005857 +0000 UTC m=+811.271180283" Apr 16 17:53:57.919992 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:53:57.919949 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" podStartSLOduration=1.241028998 podStartE2EDuration="4.919935996s" podCreationTimestamp="2026-04-16 17:53:53 +0000 UTC" firstStartedPulling="2026-04-16 17:53:53.920296477 +0000 UTC m=+807.294470882" lastFinishedPulling="2026-04-16 17:53:57.599203474 +0000 UTC m=+810.973377880" observedRunningTime="2026-04-16 17:53:57.917556761 +0000 UTC m=+811.291731191" watchObservedRunningTime="2026-04-16 17:53:57.919935996 +0000 UTC m=+811.294110423" Apr 16 17:54:03.883980 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:03.883892 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-n8vbf" Apr 16 17:54:28.887213 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:28.887150 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:54:29.747355 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.747319 2580 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-f7wss"] Apr 16 17:54:29.747582 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.747538 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" podUID="f1bd190d-225f-47dd-a61d-73c6352f7eb8" containerName="manager" containerID="cri-o://1daddb9a27057108312e77bd3279f14e4f81f64071f14cae15a975f11007928a" gracePeriod=10 Apr 16 17:54:29.778191 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.778142 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-lfqzw"] Apr 16 17:54:29.783392 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.783375 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:29.790772 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.790740 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-lfqzw"] Apr 16 17:54:29.912028 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.911992 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24jg9\" (UniqueName: \"kubernetes.io/projected/d6cad08b-2470-4a70-bc37-d6f6e9e33c95-kube-api-access-24jg9\") pod \"kserve-controller-manager-7f8f4564d-lfqzw\" (UID: \"d6cad08b-2470-4a70-bc37-d6f6e9e33c95\") " pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:29.912411 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.912134 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6cad08b-2470-4a70-bc37-d6f6e9e33c95-cert\") pod \"kserve-controller-manager-7f8f4564d-lfqzw\" (UID: \"d6cad08b-2470-4a70-bc37-d6f6e9e33c95\") " pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:29.992790 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:54:29.992755 2580 generic.go:358] "Generic (PLEG): container finished" podID="f1bd190d-225f-47dd-a61d-73c6352f7eb8" containerID="1daddb9a27057108312e77bd3279f14e4f81f64071f14cae15a975f11007928a" exitCode=0 Apr 16 17:54:29.992967 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.992845 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" event={"ID":"f1bd190d-225f-47dd-a61d-73c6352f7eb8","Type":"ContainerDied","Data":"1daddb9a27057108312e77bd3279f14e4f81f64071f14cae15a975f11007928a"} Apr 16 17:54:29.992967 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.992884 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" event={"ID":"f1bd190d-225f-47dd-a61d-73c6352f7eb8","Type":"ContainerDied","Data":"0d10495f5fe50ccc0b52083283d936eebca1318c7e0a62c1a55a1acd7133dd88"} Apr 16 17:54:29.992967 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.992894 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d10495f5fe50ccc0b52083283d936eebca1318c7e0a62c1a55a1acd7133dd88" Apr 16 17:54:29.997605 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:29.997541 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:54:30.013509 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.013475 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6cad08b-2470-4a70-bc37-d6f6e9e33c95-cert\") pod \"kserve-controller-manager-7f8f4564d-lfqzw\" (UID: \"d6cad08b-2470-4a70-bc37-d6f6e9e33c95\") " pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:30.013680 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.013540 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24jg9\" (UniqueName: \"kubernetes.io/projected/d6cad08b-2470-4a70-bc37-d6f6e9e33c95-kube-api-access-24jg9\") pod \"kserve-controller-manager-7f8f4564d-lfqzw\" (UID: \"d6cad08b-2470-4a70-bc37-d6f6e9e33c95\") " pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:30.016270 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.016248 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6cad08b-2470-4a70-bc37-d6f6e9e33c95-cert\") pod \"kserve-controller-manager-7f8f4564d-lfqzw\" (UID: \"d6cad08b-2470-4a70-bc37-d6f6e9e33c95\") " pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:30.023918 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.023884 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24jg9\" (UniqueName: \"kubernetes.io/projected/d6cad08b-2470-4a70-bc37-d6f6e9e33c95-kube-api-access-24jg9\") pod \"kserve-controller-manager-7f8f4564d-lfqzw\" (UID: \"d6cad08b-2470-4a70-bc37-d6f6e9e33c95\") " pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:30.113904 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.113869 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwvms\" (UniqueName: 
\"kubernetes.io/projected/f1bd190d-225f-47dd-a61d-73c6352f7eb8-kube-api-access-xwvms\") pod \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\" (UID: \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\") " Apr 16 17:54:30.113904 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.113918 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1bd190d-225f-47dd-a61d-73c6352f7eb8-cert\") pod \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\" (UID: \"f1bd190d-225f-47dd-a61d-73c6352f7eb8\") " Apr 16 17:54:30.116199 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.116171 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1bd190d-225f-47dd-a61d-73c6352f7eb8-kube-api-access-xwvms" (OuterVolumeSpecName: "kube-api-access-xwvms") pod "f1bd190d-225f-47dd-a61d-73c6352f7eb8" (UID: "f1bd190d-225f-47dd-a61d-73c6352f7eb8"). InnerVolumeSpecName "kube-api-access-xwvms". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:54:30.116319 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.116203 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1bd190d-225f-47dd-a61d-73c6352f7eb8-cert" (OuterVolumeSpecName: "cert") pod "f1bd190d-225f-47dd-a61d-73c6352f7eb8" (UID: "f1bd190d-225f-47dd-a61d-73c6352f7eb8"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:54:30.135526 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.135495 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:30.214878 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.214842 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwvms\" (UniqueName: \"kubernetes.io/projected/f1bd190d-225f-47dd-a61d-73c6352f7eb8-kube-api-access-xwvms\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:54:30.215008 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.214887 2580 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f1bd190d-225f-47dd-a61d-73c6352f7eb8-cert\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:54:30.261097 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.261046 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-lfqzw"] Apr 16 17:54:30.263254 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:54:30.263222 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6cad08b_2470_4a70_bc37_d6f6e9e33c95.slice/crio-2da5b53c3fecca37aaee594a80a063a97be2609619bd618bfffe75e716c4e74a WatchSource:0}: Error finding container 2da5b53c3fecca37aaee594a80a063a97be2609619bd618bfffe75e716c4e74a: Status 404 returned error can't find the container with id 2da5b53c3fecca37aaee594a80a063a97be2609619bd618bfffe75e716c4e74a Apr 16 17:54:30.997840 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.997802 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" event={"ID":"d6cad08b-2470-4a70-bc37-d6f6e9e33c95","Type":"ContainerStarted","Data":"76504ebec7c93d9e5f1271ae25c25ae235eb6a6aef3440dbbfd293c84cb29040"} Apr 16 17:54:30.998230 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.997851 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:54:30.998230 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.997864 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" event={"ID":"d6cad08b-2470-4a70-bc37-d6f6e9e33c95","Type":"ContainerStarted","Data":"2da5b53c3fecca37aaee594a80a063a97be2609619bd618bfffe75e716c4e74a"} Apr 16 17:54:30.998230 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:30.997822 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f8f4564d-f7wss" Apr 16 17:54:31.024916 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:31.024812 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" podStartSLOduration=1.475485597 podStartE2EDuration="2.024792615s" podCreationTimestamp="2026-04-16 17:54:29 +0000 UTC" firstStartedPulling="2026-04-16 17:54:30.264479363 +0000 UTC m=+843.638653769" lastFinishedPulling="2026-04-16 17:54:30.813786371 +0000 UTC m=+844.187960787" observedRunningTime="2026-04-16 17:54:31.022697003 +0000 UTC m=+844.396871430" watchObservedRunningTime="2026-04-16 17:54:31.024792615 +0000 UTC m=+844.398967045" Apr 16 17:54:31.047760 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:31.047729 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-f7wss"] Apr 16 17:54:31.054935 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:31.054905 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7f8f4564d-f7wss"] Apr 16 17:54:31.263625 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:54:31.263582 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1bd190d-225f-47dd-a61d-73c6352f7eb8" path="/var/lib/kubelet/pods/f1bd190d-225f-47dd-a61d-73c6352f7eb8/volumes" Apr 16 17:55:02.006069 ip-10-0-143-216 kubenswrapper[2580]: I0416 
17:55:02.006032 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f8f4564d-lfqzw" Apr 16 17:55:03.428759 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.428723 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-w6nmw"] Apr 16 17:55:03.429309 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.429283 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1bd190d-225f-47dd-a61d-73c6352f7eb8" containerName="manager" Apr 16 17:55:03.429309 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.429305 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1bd190d-225f-47dd-a61d-73c6352f7eb8" containerName="manager" Apr 16 17:55:03.429503 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.429409 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1bd190d-225f-47dd-a61d-73c6352f7eb8" containerName="manager" Apr 16 17:55:03.432896 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.432877 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-w6nmw" Apr 16 17:55:03.434834 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.434812 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-nzs5f\"" Apr 16 17:55:03.434926 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.434848 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 17:55:03.442837 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.442544 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-2lx6g"] Apr 16 17:55:03.453078 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.453020 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-w6nmw"] Apr 16 17:55:03.453214 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.453148 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2lx6g" Apr 16 17:55:03.455315 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.455288 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 17:55:03.455479 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.455416 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-429ws\"" Apr 16 17:55:03.456853 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.456833 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2lx6g"] Apr 16 17:55:03.507805 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.507769 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/220ff77b-8f49-4f08-afdb-942b4f149aa9-cert\") pod \"odh-model-controller-696fc77849-2lx6g\" (UID: 
\"220ff77b-8f49-4f08-afdb-942b4f149aa9\") " pod="kserve/odh-model-controller-696fc77849-2lx6g" Apr 16 17:55:03.507982 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.507890 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv9zp\" (UniqueName: \"kubernetes.io/projected/220ff77b-8f49-4f08-afdb-942b4f149aa9-kube-api-access-qv9zp\") pod \"odh-model-controller-696fc77849-2lx6g\" (UID: \"220ff77b-8f49-4f08-afdb-942b4f149aa9\") " pod="kserve/odh-model-controller-696fc77849-2lx6g" Apr 16 17:55:03.507982 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.507930 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84ef5904-1060-40fe-992b-4742e550121d-tls-certs\") pod \"model-serving-api-86f7b4b499-w6nmw\" (UID: \"84ef5904-1060-40fe-992b-4742e550121d\") " pod="kserve/model-serving-api-86f7b4b499-w6nmw" Apr 16 17:55:03.507982 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.507976 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98jkz\" (UniqueName: \"kubernetes.io/projected/84ef5904-1060-40fe-992b-4742e550121d-kube-api-access-98jkz\") pod \"model-serving-api-86f7b4b499-w6nmw\" (UID: \"84ef5904-1060-40fe-992b-4742e550121d\") " pod="kserve/model-serving-api-86f7b4b499-w6nmw" Apr 16 17:55:03.609030 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.609002 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jkz\" (UniqueName: \"kubernetes.io/projected/84ef5904-1060-40fe-992b-4742e550121d-kube-api-access-98jkz\") pod \"model-serving-api-86f7b4b499-w6nmw\" (UID: \"84ef5904-1060-40fe-992b-4742e550121d\") " pod="kserve/model-serving-api-86f7b4b499-w6nmw" Apr 16 17:55:03.609250 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.609041 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/220ff77b-8f49-4f08-afdb-942b4f149aa9-cert\") pod \"odh-model-controller-696fc77849-2lx6g\" (UID: \"220ff77b-8f49-4f08-afdb-942b4f149aa9\") " pod="kserve/odh-model-controller-696fc77849-2lx6g"
Apr 16 17:55:03.609250 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.609092 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qv9zp\" (UniqueName: \"kubernetes.io/projected/220ff77b-8f49-4f08-afdb-942b4f149aa9-kube-api-access-qv9zp\") pod \"odh-model-controller-696fc77849-2lx6g\" (UID: \"220ff77b-8f49-4f08-afdb-942b4f149aa9\") " pod="kserve/odh-model-controller-696fc77849-2lx6g"
Apr 16 17:55:03.609250 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.609117 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84ef5904-1060-40fe-992b-4742e550121d-tls-certs\") pod \"model-serving-api-86f7b4b499-w6nmw\" (UID: \"84ef5904-1060-40fe-992b-4742e550121d\") " pod="kserve/model-serving-api-86f7b4b499-w6nmw"
Apr 16 17:55:03.611749 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.611727 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/220ff77b-8f49-4f08-afdb-942b4f149aa9-cert\") pod \"odh-model-controller-696fc77849-2lx6g\" (UID: \"220ff77b-8f49-4f08-afdb-942b4f149aa9\") " pod="kserve/odh-model-controller-696fc77849-2lx6g"
Apr 16 17:55:03.611876 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.611856 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84ef5904-1060-40fe-992b-4742e550121d-tls-certs\") pod \"model-serving-api-86f7b4b499-w6nmw\" (UID: \"84ef5904-1060-40fe-992b-4742e550121d\") " pod="kserve/model-serving-api-86f7b4b499-w6nmw"
Apr 16 17:55:03.618203 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.618147 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv9zp\" (UniqueName: \"kubernetes.io/projected/220ff77b-8f49-4f08-afdb-942b4f149aa9-kube-api-access-qv9zp\") pod \"odh-model-controller-696fc77849-2lx6g\" (UID: \"220ff77b-8f49-4f08-afdb-942b4f149aa9\") " pod="kserve/odh-model-controller-696fc77849-2lx6g"
Apr 16 17:55:03.618519 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.618502 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98jkz\" (UniqueName: \"kubernetes.io/projected/84ef5904-1060-40fe-992b-4742e550121d-kube-api-access-98jkz\") pod \"model-serving-api-86f7b4b499-w6nmw\" (UID: \"84ef5904-1060-40fe-992b-4742e550121d\") " pod="kserve/model-serving-api-86f7b4b499-w6nmw"
Apr 16 17:55:03.744741 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.744635 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-w6nmw"
Apr 16 17:55:03.766551 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.766524 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-2lx6g"
Apr 16 17:55:03.899606 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.899582 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-w6nmw"]
Apr 16 17:55:03.902253 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:55:03.902224 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ef5904_1060_40fe_992b_4742e550121d.slice/crio-32bd26cac29162bf3f5a45a1f882f62c8947ae25caa04a6a4d471933ad38d30a WatchSource:0}: Error finding container 32bd26cac29162bf3f5a45a1f882f62c8947ae25caa04a6a4d471933ad38d30a: Status 404 returned error can't find the container with id 32bd26cac29162bf3f5a45a1f882f62c8947ae25caa04a6a4d471933ad38d30a
Apr 16 17:55:03.911976 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:03.911956 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-2lx6g"]
Apr 16 17:55:03.914308 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:55:03.914284 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod220ff77b_8f49_4f08_afdb_942b4f149aa9.slice/crio-41179a0fae809b2e8b98894d6c9cda86e47268ae51fa02b70c9483928e45ee8e WatchSource:0}: Error finding container 41179a0fae809b2e8b98894d6c9cda86e47268ae51fa02b70c9483928e45ee8e: Status 404 returned error can't find the container with id 41179a0fae809b2e8b98894d6c9cda86e47268ae51fa02b70c9483928e45ee8e
Apr 16 17:55:04.112283 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:04.112198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2lx6g" event={"ID":"220ff77b-8f49-4f08-afdb-942b4f149aa9","Type":"ContainerStarted","Data":"41179a0fae809b2e8b98894d6c9cda86e47268ae51fa02b70c9483928e45ee8e"}
Apr 16 17:55:04.113201 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:04.113150 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-w6nmw" event={"ID":"84ef5904-1060-40fe-992b-4742e550121d","Type":"ContainerStarted","Data":"32bd26cac29162bf3f5a45a1f882f62c8947ae25caa04a6a4d471933ad38d30a"}
Apr 16 17:55:07.125964 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:07.125931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-2lx6g" event={"ID":"220ff77b-8f49-4f08-afdb-942b4f149aa9","Type":"ContainerStarted","Data":"c8553f3eba1526d99299986845d656c904e06684cb6c1cdc79da22b4a2c82d0b"}
Apr 16 17:55:07.126431 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:07.125994 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-2lx6g"
Apr 16 17:55:07.127266 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:07.127242 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-w6nmw" event={"ID":"84ef5904-1060-40fe-992b-4742e550121d","Type":"ContainerStarted","Data":"065585869aefcadd1f3999e1ac84f327b90fe09240202a532b966d6fff48ae72"}
Apr 16 17:55:07.127356 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:07.127350 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-w6nmw"
Apr 16 17:55:07.144390 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:07.144346 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-2lx6g" podStartSLOduration=1.373123732 podStartE2EDuration="4.144331781s" podCreationTimestamp="2026-04-16 17:55:03 +0000 UTC" firstStartedPulling="2026-04-16 17:55:03.915628365 +0000 UTC m=+877.289802770" lastFinishedPulling="2026-04-16 17:55:06.686836409 +0000 UTC m=+880.061010819" observedRunningTime="2026-04-16 17:55:07.142952833 +0000 UTC m=+880.517127262" watchObservedRunningTime="2026-04-16 17:55:07.144331781 +0000 UTC m=+880.518506208"
Apr 16 17:55:07.160877 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:07.160830 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-w6nmw" podStartSLOduration=1.430917755 podStartE2EDuration="4.160814622s" podCreationTimestamp="2026-04-16 17:55:03 +0000 UTC" firstStartedPulling="2026-04-16 17:55:03.903870257 +0000 UTC m=+877.278044668" lastFinishedPulling="2026-04-16 17:55:06.633767111 +0000 UTC m=+880.007941535" observedRunningTime="2026-04-16 17:55:07.159979335 +0000 UTC m=+880.534153766" watchObservedRunningTime="2026-04-16 17:55:07.160814622 +0000 UTC m=+880.534989049"
Apr 16 17:55:18.133620 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:18.133541 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-2lx6g"
Apr 16 17:55:18.135598 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:18.135566 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-w6nmw"
Apr 16 17:55:52.830139 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.830108 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"]
Apr 16 17:55:52.833988 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.833963 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:52.837232 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.837212 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 17:55:52.837367 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.837274 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 17:55:52.837367 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.837337 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\""
Apr 16 17:55:52.837462 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.837423 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 17:55:52.843874 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.843857 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"]
Apr 16 17:55:52.950280 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.950242 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-dshm\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:52.950476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.950292 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-home\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:52.950476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.950360 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbgc6\" (UniqueName: \"kubernetes.io/projected/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kube-api-access-qbgc6\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:52.950476 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.950420 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:52.950608 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.950535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:52.950608 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:52.950562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051324 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051283 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-dshm\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051324 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051328 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-home\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051535 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051451 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbgc6\" (UniqueName: \"kubernetes.io/projected/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kube-api-access-qbgc6\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051535 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051489 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051612 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051596 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051664 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051630 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051804 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051783 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-home\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051900 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051879 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-model-cache\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.051957 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.051916 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.053690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.053672 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-dshm\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.054201 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.054182 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.065216 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.065193 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbgc6\" (UniqueName: \"kubernetes.io/projected/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kube-api-access-qbgc6\") pod \"scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.145710 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.145619 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:55:53.290874 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:53.290852 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"]
Apr 16 17:55:53.294305 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:55:53.294276 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6dd2b56_b7e3_4d24_9118_c2cd6127aa70.slice/crio-67534c8b91e18c1b6115ba8ece1365e3d01683a9dd907fd174d487e900762b43 WatchSource:0}: Error finding container 67534c8b91e18c1b6115ba8ece1365e3d01683a9dd907fd174d487e900762b43: Status 404 returned error can't find the container with id 67534c8b91e18c1b6115ba8ece1365e3d01683a9dd907fd174d487e900762b43
Apr 16 17:55:54.295242 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:54.295204 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss" event={"ID":"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70","Type":"ContainerStarted","Data":"67534c8b91e18c1b6115ba8ece1365e3d01683a9dd907fd174d487e900762b43"}
Apr 16 17:55:57.307773 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:55:57.307742 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss" event={"ID":"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70","Type":"ContainerStarted","Data":"541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84"}
Apr 16 17:56:01.324641 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:01.324608 2580 generic.go:358] "Generic (PLEG): container finished" podID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" containerID="541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84" exitCode=0
Apr 16 17:56:01.325052 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:01.324685 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss" event={"ID":"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70","Type":"ContainerDied","Data":"541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84"}
Apr 16 17:56:03.335076 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:03.335044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss" event={"ID":"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70","Type":"ContainerStarted","Data":"1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695"}
Apr 16 17:56:03.360775 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:03.360703 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss" podStartSLOduration=2.147886515 podStartE2EDuration="11.360683131s" podCreationTimestamp="2026-04-16 17:55:52 +0000 UTC" firstStartedPulling="2026-04-16 17:55:53.295972036 +0000 UTC m=+926.670146446" lastFinishedPulling="2026-04-16 17:56:02.508768657 +0000 UTC m=+935.882943062" observedRunningTime="2026-04-16 17:56:03.358587779 +0000 UTC m=+936.732762218" watchObservedRunningTime="2026-04-16 17:56:03.360683131 +0000 UTC m=+936.734857562"
Apr 16 17:56:13.146071 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:13.146037 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:56:13.146071 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:13.146072 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:56:13.282558 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:13.158410 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:56:13.383665 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:13.383634 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:56:55.135350 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.135274 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"]
Apr 16 17:56:55.139520 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.139497 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.141919 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.141895 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 16 17:56:55.142033 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.141965 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-vlt5k\""
Apr 16 17:56:55.153294 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.153268 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"]
Apr 16 17:56:55.222688 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.222660 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.222848 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.222693 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.222848 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.222719 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krz4d\" (UniqueName: \"kubernetes.io/projected/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kube-api-access-krz4d\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.222848 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.222810 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.222954 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.222908 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.222988 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.222967 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.323956 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.323914 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.323956 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.323960 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.324269 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.323987 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krz4d\" (UniqueName: \"kubernetes.io/projected/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kube-api-access-krz4d\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.324269 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.324020 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.324269 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.324112 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.324269 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.324219 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.324472 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.324386 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.324472 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.324447 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.324578 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.324521 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.324634 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.324610 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.326782 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.326757 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.334600 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.334580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krz4d\" (UniqueName: \"kubernetes.io/projected/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kube-api-access-krz4d\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.449880 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.449841 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"
Apr 16 17:56:55.576692 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:55.576665 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"]
Apr 16 17:56:55.579066 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:56:55.579035 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfef957e7_255b_4f8b_bba8_8c8907f75ab7.slice/crio-08deb12fe7a8eafcb002b0d4c7b61650a10c2c4aaae7862e9d84067763511d14 WatchSource:0}: Error finding container 08deb12fe7a8eafcb002b0d4c7b61650a10c2c4aaae7862e9d84067763511d14: Status 404 returned error can't find the container with id 08deb12fe7a8eafcb002b0d4c7b61650a10c2c4aaae7862e9d84067763511d14
Apr 16 17:56:56.527056 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:56.527024 2580 generic.go:358] "Generic (PLEG): container finished" podID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerID="087aa48000c0213256f67e3b75b66fe3a47a08da30d1ac6ea8f648ce3018caed" exitCode=0
Apr 16 17:56:56.527454 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:56.527106 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" event={"ID":"fef957e7-255b-4f8b-bba8-8c8907f75ab7","Type":"ContainerDied","Data":"087aa48000c0213256f67e3b75b66fe3a47a08da30d1ac6ea8f648ce3018caed"}
Apr 16 17:56:56.527454 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:56.527147 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" event={"ID":"fef957e7-255b-4f8b-bba8-8c8907f75ab7","Type":"ContainerStarted","Data":"08deb12fe7a8eafcb002b0d4c7b61650a10c2c4aaae7862e9d84067763511d14"}
Apr 16 17:56:56.975728 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:56.975690 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"]
Apr 16 17:56:56.976048 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:56.976019 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss" podUID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" containerName="main" containerID="cri-o://1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695" gracePeriod=30
Apr 16 17:56:57.470171 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.470134 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:56:57.531223 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.531196 2580 generic.go:358] "Generic (PLEG): container finished" podID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" containerID="1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695" exitCode=0
Apr 16 17:56:57.531500 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.531231 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss" event={"ID":"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70","Type":"ContainerDied","Data":"1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695"}
Apr 16 17:56:57.531500 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.531277 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss" event={"ID":"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70","Type":"ContainerDied","Data":"67534c8b91e18c1b6115ba8ece1365e3d01683a9dd907fd174d487e900762b43"}
Apr 16 17:56:57.531500 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.531284 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"
Apr 16 17:56:57.531500 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.531297 2580 scope.go:117] "RemoveContainer" containerID="1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695"
Apr 16 17:56:57.549334 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.549249 2580 scope.go:117] "RemoveContainer" containerID="541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84"
Apr 16 17:56:57.628711 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.628689 2580 scope.go:117] "RemoveContainer" containerID="1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695"
Apr 16 17:56:57.629026 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:56:57.629002 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695\": container with ID starting with 1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695 not found: ID does not exist" containerID="1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695"
Apr 16 17:56:57.629080 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.629038 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695"} err="failed to get container status \"1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695\": rpc error: code = NotFound desc = could not find container \"1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695\": container with ID starting with 1fb60c964643cec1a98097ed525ce8c48821c23c9b56680e78a7689686e04695 not found: ID does not exist"
Apr 16 17:56:57.629080 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.629058 2580 scope.go:117] "RemoveContainer" containerID="541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84"
Apr 16
17:56:57.629374 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:56:57.629360 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84\": container with ID starting with 541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84 not found: ID does not exist" containerID="541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84" Apr 16 17:56:57.629434 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.629380 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84"} err="failed to get container status \"541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84\": rpc error: code = NotFound desc = could not find container \"541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84\": container with ID starting with 541f9431d80efefd79ca7e0a85e9a24e8f8f6e15daf2b55f7305b28f874bfb84 not found: ID does not exist" Apr 16 17:56:57.645078 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645050 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-tls-certs\") pod \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " Apr 16 17:56:57.645202 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645098 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-home\") pod \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " Apr 16 17:56:57.645254 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645206 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-model-cache\") pod \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " Apr 16 17:56:57.645309 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645278 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-dshm\") pod \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " Apr 16 17:56:57.645363 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645324 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbgc6\" (UniqueName: \"kubernetes.io/projected/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kube-api-access-qbgc6\") pod \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " Apr 16 17:56:57.645363 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645358 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kserve-provision-location\") pod \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\" (UID: \"a6dd2b56-b7e3-4d24-9118-c2cd6127aa70\") " Apr 16 17:56:57.645470 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645366 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-home" (OuterVolumeSpecName: "home") pod "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" (UID: "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:56:57.645728 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645513 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-model-cache" (OuterVolumeSpecName: "model-cache") pod "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" (UID: "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:56:57.645728 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645645 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-model-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:56:57.645728 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.645664 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-home\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:56:57.647530 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.647498 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" (UID: "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:56:57.647618 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.647580 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kube-api-access-qbgc6" (OuterVolumeSpecName: "kube-api-access-qbgc6") pod "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" (UID: "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70"). InnerVolumeSpecName "kube-api-access-qbgc6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:56:57.647836 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.647818 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-dshm" (OuterVolumeSpecName: "dshm") pod "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" (UID: "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:56:57.703836 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.703755 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" (UID: "a6dd2b56-b7e3-4d24-9118-c2cd6127aa70"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:56:57.747106 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.747065 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-dshm\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:56:57.747106 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.747116 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbgc6\" (UniqueName: \"kubernetes.io/projected/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kube-api-access-qbgc6\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:56:57.747347 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.747128 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:56:57.747347 ip-10-0-143-216 kubenswrapper[2580]: I0416 
17:56:57.747145 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:56:57.861726 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.861681 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"] Apr 16 17:56:57.869207 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:57.869186 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-6fd7696646-bt8ss"] Apr 16 17:56:58.537633 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:58.537589 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" event={"ID":"fef957e7-255b-4f8b-bba8-8c8907f75ab7","Type":"ContainerStarted","Data":"063ec38e8a54c8862bfb1433fb3035661878432cbfe0f46450d88dc7b12c0ff9"} Apr 16 17:56:59.266647 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:56:59.266612 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" path="/var/lib/kubelet/pods/a6dd2b56-b7e3-4d24-9118-c2cd6127aa70/volumes" Apr 16 17:57:28.664376 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:28.664273 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" event={"ID":"fef957e7-255b-4f8b-bba8-8c8907f75ab7","Type":"ContainerStarted","Data":"803cd47e6a2c04f933573e0b335d8ed18b2d685577cea5a494c44906d0bf8d9c"} Apr 16 17:57:28.664841 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:28.664474 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" Apr 16 17:57:28.666884 ip-10-0-143-216 kubenswrapper[2580]: 
I0416 17:57:28.666863 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" Apr 16 17:57:28.691002 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:28.690947 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" podStartSLOduration=1.878166306 podStartE2EDuration="33.690931794s" podCreationTimestamp="2026-04-16 17:56:55 +0000 UTC" firstStartedPulling="2026-04-16 17:56:56.528177054 +0000 UTC m=+989.902351459" lastFinishedPulling="2026-04-16 17:57:28.340942538 +0000 UTC m=+1021.715116947" observedRunningTime="2026-04-16 17:57:28.689717827 +0000 UTC m=+1022.063892269" watchObservedRunningTime="2026-04-16 17:57:28.690931794 +0000 UTC m=+1022.065106222" Apr 16 17:57:35.450526 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:35.450490 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" Apr 16 17:57:35.450526 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:35.450529 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" Apr 16 17:57:35.451051 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:35.450783 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.35:8082/healthz\": dial tcp 10.134.0.35:8082: connect: connection refused" Apr 16 17:57:45.452245 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:45.452216 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" Apr 16 17:57:45.453358 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:45.453332 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" Apr 16 17:57:56.460571 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:56.460534 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"] Apr 16 17:57:56.460968 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:56.460835 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="main" containerID="cri-o://063ec38e8a54c8862bfb1433fb3035661878432cbfe0f46450d88dc7b12c0ff9" gracePeriod=30 Apr 16 17:57:56.460968 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:56.460870 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="tokenizer" containerID="cri-o://803cd47e6a2c04f933573e0b335d8ed18b2d685577cea5a494c44906d0bf8d9c" gracePeriod=30 Apr 16 17:57:57.780151 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.780120 2580 generic.go:358] "Generic (PLEG): container finished" podID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerID="803cd47e6a2c04f933573e0b335d8ed18b2d685577cea5a494c44906d0bf8d9c" exitCode=0 Apr 16 17:57:57.780151 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.780143 2580 generic.go:358] "Generic (PLEG): container finished" podID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerID="063ec38e8a54c8862bfb1433fb3035661878432cbfe0f46450d88dc7b12c0ff9" exitCode=0 Apr 16 17:57:57.780527 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:57:57.780201 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" event={"ID":"fef957e7-255b-4f8b-bba8-8c8907f75ab7","Type":"ContainerDied","Data":"803cd47e6a2c04f933573e0b335d8ed18b2d685577cea5a494c44906d0bf8d9c"} Apr 16 17:57:57.780527 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.780246 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" event={"ID":"fef957e7-255b-4f8b-bba8-8c8907f75ab7","Type":"ContainerDied","Data":"063ec38e8a54c8862bfb1433fb3035661878432cbfe0f46450d88dc7b12c0ff9"} Apr 16 17:57:57.835955 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.835933 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" Apr 16 17:57:57.916616 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.916584 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krz4d\" (UniqueName: \"kubernetes.io/projected/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kube-api-access-krz4d\") pod \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " Apr 16 17:57:57.916812 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.916637 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kserve-provision-location\") pod \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " Apr 16 17:57:57.916812 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.916656 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-tmp\") pod \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " Apr 16 17:57:57.916812 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.916700 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tls-certs\") pod \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " Apr 16 17:57:57.916812 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.916727 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-uds\") pod \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " Apr 16 17:57:57.916812 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.916747 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-cache\") pod \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\" (UID: \"fef957e7-255b-4f8b-bba8-8c8907f75ab7\") " Apr 16 17:57:57.917075 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.916997 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fef957e7-255b-4f8b-bba8-8c8907f75ab7" (UID: "fef957e7-255b-4f8b-bba8-8c8907f75ab7"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:57.917075 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.917019 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fef957e7-255b-4f8b-bba8-8c8907f75ab7" (UID: "fef957e7-255b-4f8b-bba8-8c8907f75ab7"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:57.917075 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.917031 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fef957e7-255b-4f8b-bba8-8c8907f75ab7" (UID: "fef957e7-255b-4f8b-bba8-8c8907f75ab7"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:57.917381 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.917359 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fef957e7-255b-4f8b-bba8-8c8907f75ab7" (UID: "fef957e7-255b-4f8b-bba8-8c8907f75ab7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:57:57.919058 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.919038 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kube-api-access-krz4d" (OuterVolumeSpecName: "kube-api-access-krz4d") pod "fef957e7-255b-4f8b-bba8-8c8907f75ab7" (UID: "fef957e7-255b-4f8b-bba8-8c8907f75ab7"). InnerVolumeSpecName "kube-api-access-krz4d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:57:57.919148 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:57.919080 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fef957e7-255b-4f8b-bba8-8c8907f75ab7" (UID: "fef957e7-255b-4f8b-bba8-8c8907f75ab7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:57:58.017977 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.017942 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-krz4d\" (UniqueName: \"kubernetes.io/projected/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kube-api-access-krz4d\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.017977 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.017974 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.018187 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.017988 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-tmp\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.018187 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.017997 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.018187 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.018006 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-uds\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.018187 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.018013 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fef957e7-255b-4f8b-bba8-8c8907f75ab7-tokenizer-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:57:58.785764 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.785738 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" Apr 16 17:57:58.786343 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.785735 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w" event={"ID":"fef957e7-255b-4f8b-bba8-8c8907f75ab7","Type":"ContainerDied","Data":"08deb12fe7a8eafcb002b0d4c7b61650a10c2c4aaae7862e9d84067763511d14"} Apr 16 17:57:58.786343 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.785859 2580 scope.go:117] "RemoveContainer" containerID="803cd47e6a2c04f933573e0b335d8ed18b2d685577cea5a494c44906d0bf8d9c" Apr 16 17:57:58.794315 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.794297 2580 scope.go:117] "RemoveContainer" containerID="063ec38e8a54c8862bfb1433fb3035661878432cbfe0f46450d88dc7b12c0ff9" Apr 16 17:57:58.802084 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.802067 2580 scope.go:117] "RemoveContainer" containerID="087aa48000c0213256f67e3b75b66fe3a47a08da30d1ac6ea8f648ce3018caed" Apr 16 17:57:58.814582 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.814555 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"] Apr 16 17:57:58.816504 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:58.816483 2580 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d5d4d87887w"] Apr 16 17:57:59.263378 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:57:59.263346 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" path="/var/lib/kubelet/pods/fef957e7-255b-4f8b-bba8-8c8907f75ab7/volumes" Apr 16 17:58:16.511843 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.511768 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq"] Apr 16 17:58:16.512211 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512182 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="tokenizer" Apr 16 17:58:16.512211 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512193 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="tokenizer" Apr 16 17:58:16.512287 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512216 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" containerName="storage-initializer" Apr 16 17:58:16.512287 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512221 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" containerName="storage-initializer" Apr 16 17:58:16.512287 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512227 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="storage-initializer" Apr 16 17:58:16.512287 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512232 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="storage-initializer" Apr 16 17:58:16.512287 
ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512238 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="main" Apr 16 17:58:16.512287 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512243 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="main" Apr 16 17:58:16.512287 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512253 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" containerName="main" Apr 16 17:58:16.512287 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512258 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" containerName="main" Apr 16 17:58:16.512514 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512312 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="main" Apr 16 17:58:16.512514 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512322 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="fef957e7-255b-4f8b-bba8-8c8907f75ab7" containerName="tokenizer" Apr 16 17:58:16.512514 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.512329 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6dd2b56-b7e3-4d24-9118-c2cd6127aa70" containerName="main" Apr 16 17:58:16.515450 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.515431 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.524401 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.524375 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 17:58:16.524510 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.524460 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 17:58:16.524958 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.524941 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 17:58:16.531457 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.531431 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq"] Apr 16 17:58:16.538171 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.538137 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 17:58:16.585742 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.585706 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2be45b1-7216-4bad-ac53-ba227989a660-tls-certs\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.585907 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.585747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg2rf\" (UniqueName: \"kubernetes.io/projected/a2be45b1-7216-4bad-ac53-ba227989a660-kube-api-access-vg2rf\") pod 
\"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.585907 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.585767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-model-cache\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.585907 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.585861 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-dshm\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.586011 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.585927 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.586011 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.585951 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-home\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687121 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687072 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687121 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-home\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687351 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2be45b1-7216-4bad-ac53-ba227989a660-tls-certs\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687351 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687253 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg2rf\" (UniqueName: \"kubernetes.io/projected/a2be45b1-7216-4bad-ac53-ba227989a660-kube-api-access-vg2rf\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687427 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687390 
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-model-cache\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687484 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687454 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-dshm\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687541 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687531 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687594 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687568 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-home\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.687763 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.687742 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-model-cache\") pod 
\"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.689600 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.689583 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-dshm\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.689855 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.689840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2be45b1-7216-4bad-ac53-ba227989a660-tls-certs\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.706513 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.706490 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg2rf\" (UniqueName: \"kubernetes.io/projected/a2be45b1-7216-4bad-ac53-ba227989a660-kube-api-access-vg2rf\") pod \"precise-prefix-cache-test-kserve-6cc77f558b-q9mvq\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.825289 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.825195 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:16.960572 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.960539 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87"] Apr 16 17:58:16.965629 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.965609 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:16.975975 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:16.975949 2580 status_manager.go:895] "Failed to get status for pod" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" err="pods \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" is forbidden: User \"system:node:ip-10-0-143-216.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-143-216.ec2.internal' and this object" Apr 16 17:58:16.976378 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:58:16.976354 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"precise-prefix-cache-test-epp-sa-dockercfg-hb8fw\" is forbidden: User \"system:node:ip-10-0-143-216.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-143-216.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-hb8fw\"" type="*v1.Secret" Apr 16 17:58:17.000746 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.000713 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq"] Apr 16 17:58:17.002002 
ip-10-0-143-216 kubenswrapper[2580]: W0416 17:58:17.001972 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2be45b1_7216_4bad_ac53_ba227989a660.slice/crio-5627645a1486ba4ee67bead0d8a29686b2259c648172719ea853e75cf931f830 WatchSource:0}: Error finding container 5627645a1486ba4ee67bead0d8a29686b2259c648172719ea853e75cf931f830: Status 404 returned error can't find the container with id 5627645a1486ba4ee67bead0d8a29686b2259c648172719ea853e75cf931f830 Apr 16 17:58:17.033683 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.033658 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87"] Apr 16 17:58:17.090979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.090889 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.090979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.090929 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.090979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.090964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.091277 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.091055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.091277 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.091085 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kube-api-access-kbhnx\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.091277 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.091125 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192246 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192211 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192246 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192249 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192545 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192269 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192545 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192317 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192545 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192337 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhnx\" (UniqueName: 
\"kubernetes.io/projected/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kube-api-access-kbhnx\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192545 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192754 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192692 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192816 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192747 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192816 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192796 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.192888 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.192837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.194911 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.194886 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.231744 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.231712 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kube-api-access-kbhnx\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:17.857840 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.857807 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" 
event={"ID":"a2be45b1-7216-4bad-ac53-ba227989a660","Type":"ContainerStarted","Data":"80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157"} Apr 16 17:58:17.857840 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.857844 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" event={"ID":"a2be45b1-7216-4bad-ac53-ba227989a660","Type":"ContainerStarted","Data":"5627645a1486ba4ee67bead0d8a29686b2259c648172719ea853e75cf931f830"} Apr 16 17:58:17.972717 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.972677 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-hb8fw\"" Apr 16 17:58:17.975570 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:17.975544 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:18.160350 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:18.160272 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87"] Apr 16 17:58:18.164913 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:58:18.164875 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b140d1d_e742_4682_adcb_689fd7ffb0a6.slice/crio-908ad268dc1af613339852262aa36ac91ef03960dbfc7ea8dda35e28f449d301 WatchSource:0}: Error finding container 908ad268dc1af613339852262aa36ac91ef03960dbfc7ea8dda35e28f449d301: Status 404 returned error can't find the container with id 908ad268dc1af613339852262aa36ac91ef03960dbfc7ea8dda35e28f449d301 Apr 16 17:58:18.863232 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:18.863183 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" event={"ID":"0b140d1d-e742-4682-adcb-689fd7ffb0a6","Type":"ContainerStarted","Data":"84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5"} Apr 16 17:58:18.863232 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:18.863243 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" event={"ID":"0b140d1d-e742-4682-adcb-689fd7ffb0a6","Type":"ContainerStarted","Data":"908ad268dc1af613339852262aa36ac91ef03960dbfc7ea8dda35e28f449d301"} Apr 16 17:58:19.868451 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:19.868416 2580 generic.go:358] "Generic (PLEG): container finished" podID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerID="84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5" exitCode=0 Apr 16 17:58:19.868843 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:19.868510 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" event={"ID":"0b140d1d-e742-4682-adcb-689fd7ffb0a6","Type":"ContainerDied","Data":"84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5"} Apr 16 17:58:20.875230 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:20.875192 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" event={"ID":"0b140d1d-e742-4682-adcb-689fd7ffb0a6","Type":"ContainerStarted","Data":"f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c"} Apr 16 17:58:20.875230 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:20.875234 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" 
event={"ID":"0b140d1d-e742-4682-adcb-689fd7ffb0a6","Type":"ContainerStarted","Data":"dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63"} Apr 16 17:58:20.875639 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:20.875338 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:20.907930 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:20.907876 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" podStartSLOduration=4.907861311 podStartE2EDuration="4.907861311s" podCreationTimestamp="2026-04-16 17:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:58:20.90750542 +0000 UTC m=+1074.281679848" watchObservedRunningTime="2026-04-16 17:58:20.907861311 +0000 UTC m=+1074.282035738" Apr 16 17:58:21.880289 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:21.880253 2580 generic.go:358] "Generic (PLEG): container finished" podID="a2be45b1-7216-4bad-ac53-ba227989a660" containerID="80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157" exitCode=0 Apr 16 17:58:21.880716 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:21.880340 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" event={"ID":"a2be45b1-7216-4bad-ac53-ba227989a660","Type":"ContainerDied","Data":"80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157"} Apr 16 17:58:22.886690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:22.886658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" 
event={"ID":"a2be45b1-7216-4bad-ac53-ba227989a660","Type":"ContainerStarted","Data":"7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c"} Apr 16 17:58:26.826121 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:26.826077 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:26.826597 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:26.826133 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:26.839427 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:26.839386 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:26.866033 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:26.865979 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" podStartSLOduration=10.865960098 podStartE2EDuration="10.865960098s" podCreationTimestamp="2026-04-16 17:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:58:22.9190194 +0000 UTC m=+1076.293193827" watchObservedRunningTime="2026-04-16 17:58:26.865960098 +0000 UTC m=+1080.240134528" Apr 16 17:58:26.913847 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:26.913821 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:27.975815 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:27.975778 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:27.975815 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:58:27.975824 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:27.977246 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:58:27.977227 2580 logging.go:55] [core] [Channel #35 SubChannel #36]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused" Apr 16 17:58:27.978562 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:27.978541 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:28.910769 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:28.910742 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:28.976133 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:28.976086 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded" Apr 16 17:58:37.976437 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:58:37.976405 2580 logging.go:55] [core] [Channel #37 SubChannel #38]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.37:9003", ServerName: "10.134.0.37:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.37:9003: connect: connection refused" Apr 16 17:58:38.976681 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:38.976637 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.37:9003\" within 1s: context deadline exceeded" Apr 16 17:58:49.914133 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:49.914105 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:52.097931 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.097892 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87"] Apr 16 17:58:52.098339 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.098173 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="main" containerID="cri-o://dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63" gracePeriod=30 Apr 16 17:58:52.098339 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.098267 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="tokenizer" containerID="cri-o://f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c" gracePeriod=30 Apr 16 17:58:52.101249 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.101224 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq"] Apr 16 17:58:52.101792 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.101743 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" podUID="a2be45b1-7216-4bad-ac53-ba227989a660" containerName="main" containerID="cri-o://7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c" gracePeriod=30 Apr 16 17:58:52.354344 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.354286 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:52.422474 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.422441 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-dshm\") pod \"a2be45b1-7216-4bad-ac53-ba227989a660\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " Apr 16 17:58:52.422676 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.422544 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-kserve-provision-location\") pod \"a2be45b1-7216-4bad-ac53-ba227989a660\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " Apr 16 17:58:52.422676 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.422583 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-model-cache\") pod \"a2be45b1-7216-4bad-ac53-ba227989a660\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " Apr 16 17:58:52.422676 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.422621 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-home\") pod \"a2be45b1-7216-4bad-ac53-ba227989a660\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " Apr 16 17:58:52.422676 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.422671 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg2rf\" (UniqueName: \"kubernetes.io/projected/a2be45b1-7216-4bad-ac53-ba227989a660-kube-api-access-vg2rf\") pod \"a2be45b1-7216-4bad-ac53-ba227989a660\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " Apr 16 17:58:52.422886 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.422704 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2be45b1-7216-4bad-ac53-ba227989a660-tls-certs\") pod \"a2be45b1-7216-4bad-ac53-ba227989a660\" (UID: \"a2be45b1-7216-4bad-ac53-ba227989a660\") " Apr 16 17:58:52.422935 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.422885 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-model-cache" (OuterVolumeSpecName: "model-cache") pod "a2be45b1-7216-4bad-ac53-ba227989a660" (UID: "a2be45b1-7216-4bad-ac53-ba227989a660"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:52.422935 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.422899 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-home" (OuterVolumeSpecName: "home") pod "a2be45b1-7216-4bad-ac53-ba227989a660" (UID: "a2be45b1-7216-4bad-ac53-ba227989a660"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:52.423068 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.423045 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-model-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.423142 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.423068 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-home\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.424889 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.424859 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-dshm" (OuterVolumeSpecName: "dshm") pod "a2be45b1-7216-4bad-ac53-ba227989a660" (UID: "a2be45b1-7216-4bad-ac53-ba227989a660"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:52.425078 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.425055 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2be45b1-7216-4bad-ac53-ba227989a660-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a2be45b1-7216-4bad-ac53-ba227989a660" (UID: "a2be45b1-7216-4bad-ac53-ba227989a660"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:58:52.425694 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.425674 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2be45b1-7216-4bad-ac53-ba227989a660-kube-api-access-vg2rf" (OuterVolumeSpecName: "kube-api-access-vg2rf") pod "a2be45b1-7216-4bad-ac53-ba227989a660" (UID: "a2be45b1-7216-4bad-ac53-ba227989a660"). InnerVolumeSpecName "kube-api-access-vg2rf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:58:52.480692 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.480631 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a2be45b1-7216-4bad-ac53-ba227989a660" (UID: "a2be45b1-7216-4bad-ac53-ba227989a660"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:52.523985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.523955 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2be45b1-7216-4bad-ac53-ba227989a660-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.523985 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.523982 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-dshm\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.524209 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.523994 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2be45b1-7216-4bad-ac53-ba227989a660-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.524209 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.524004 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vg2rf\" (UniqueName: \"kubernetes.io/projected/a2be45b1-7216-4bad-ac53-ba227989a660-kube-api-access-vg2rf\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:52.997349 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.997317 2580 generic.go:358] "Generic (PLEG): container finished" podID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" 
containerID="dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63" exitCode=0 Apr 16 17:58:52.997534 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.997391 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" event={"ID":"0b140d1d-e742-4682-adcb-689fd7ffb0a6","Type":"ContainerDied","Data":"dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63"} Apr 16 17:58:52.998810 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.998787 2580 generic.go:358] "Generic (PLEG): container finished" podID="a2be45b1-7216-4bad-ac53-ba227989a660" containerID="7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c" exitCode=0 Apr 16 17:58:52.998930 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.998821 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" event={"ID":"a2be45b1-7216-4bad-ac53-ba227989a660","Type":"ContainerDied","Data":"7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c"} Apr 16 17:58:52.998930 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.998857 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" Apr 16 17:58:52.998930 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.998861 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq" event={"ID":"a2be45b1-7216-4bad-ac53-ba227989a660","Type":"ContainerDied","Data":"5627645a1486ba4ee67bead0d8a29686b2259c648172719ea853e75cf931f830"} Apr 16 17:58:52.998930 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:52.998883 2580 scope.go:117] "RemoveContainer" containerID="7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c" Apr 16 17:58:53.010389 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.010363 2580 scope.go:117] "RemoveContainer" containerID="80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157" Apr 16 17:58:53.021949 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.021922 2580 scope.go:117] "RemoveContainer" containerID="7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c" Apr 16 17:58:53.022300 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:58:53.022275 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c\": container with ID starting with 7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c not found: ID does not exist" containerID="7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c" Apr 16 17:58:53.022430 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.022310 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c"} err="failed to get container status \"7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c\": rpc error: code = NotFound desc = could not find container 
\"7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c\": container with ID starting with 7f44830c22fa28e46f5f76aa0fbb5dd1e3523640bba65ae44337daf3caf3259c not found: ID does not exist" Apr 16 17:58:53.022430 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.022340 2580 scope.go:117] "RemoveContainer" containerID="80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157" Apr 16 17:58:53.022654 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:58:53.022638 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157\": container with ID starting with 80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157 not found: ID does not exist" containerID="80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157" Apr 16 17:58:53.022708 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.022663 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157"} err="failed to get container status \"80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157\": rpc error: code = NotFound desc = could not find container \"80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157\": container with ID starting with 80fc4cc545a919004833e547e2f8e11db92601927a7429058d4e483466ace157 not found: ID does not exist" Apr 16 17:58:53.043981 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.036906 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq"] Apr 16 17:58:53.116763 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.116729 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6cc77f558b-q9mvq"] Apr 16 17:58:53.264649 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.264568 
2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2be45b1-7216-4bad-ac53-ba227989a660" path="/var/lib/kubelet/pods/a2be45b1-7216-4bad-ac53-ba227989a660/volumes" Apr 16 17:58:53.656428 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.656406 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:53.736443 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736410 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-tmp\") pod \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " Apr 16 17:58:53.736595 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736514 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-uds\") pod \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " Apr 16 17:58:53.736595 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736538 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kserve-provision-location\") pod \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " Apr 16 17:58:53.736595 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736580 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-cache\") pod \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " Apr 16 17:58:53.736711 ip-10-0-143-216 
kubenswrapper[2580]: I0416 17:58:53.736613 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tls-certs\") pod \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " Apr 16 17:58:53.736711 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736638 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kube-api-access-kbhnx\") pod \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\" (UID: \"0b140d1d-e742-4682-adcb-689fd7ffb0a6\") " Apr 16 17:58:53.736806 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736769 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0b140d1d-e742-4682-adcb-689fd7ffb0a6" (UID: "0b140d1d-e742-4682-adcb-689fd7ffb0a6"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:53.736806 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736791 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0b140d1d-e742-4682-adcb-689fd7ffb0a6" (UID: "0b140d1d-e742-4682-adcb-689fd7ffb0a6"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:53.736953 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736934 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-tmp\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:53.737011 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.736961 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-uds\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:53.737079 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.737053 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0b140d1d-e742-4682-adcb-689fd7ffb0a6" (UID: "0b140d1d-e742-4682-adcb-689fd7ffb0a6"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:53.737320 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.737303 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b140d1d-e742-4682-adcb-689fd7ffb0a6" (UID: "0b140d1d-e742-4682-adcb-689fd7ffb0a6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:53.738984 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.738959 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kube-api-access-kbhnx" (OuterVolumeSpecName: "kube-api-access-kbhnx") pod "0b140d1d-e742-4682-adcb-689fd7ffb0a6" (UID: "0b140d1d-e742-4682-adcb-689fd7ffb0a6"). InnerVolumeSpecName "kube-api-access-kbhnx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:58:53.739093 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.739047 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0b140d1d-e742-4682-adcb-689fd7ffb0a6" (UID: "0b140d1d-e742-4682-adcb-689fd7ffb0a6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:58:53.837795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.837707 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:53.837795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.837745 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tokenizer-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:53.837795 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:53.837758 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b140d1d-e742-4682-adcb-689fd7ffb0a6-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:53.837795 ip-10-0-143-216 kubenswrapper[2580]: I0416 
17:58:53.837766 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbhnx\" (UniqueName: \"kubernetes.io/projected/0b140d1d-e742-4682-adcb-689fd7ffb0a6-kube-api-access-kbhnx\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 17:58:54.006037 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.006003 2580 generic.go:358] "Generic (PLEG): container finished" podID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerID="f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c" exitCode=0 Apr 16 17:58:54.006216 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.006112 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" event={"ID":"0b140d1d-e742-4682-adcb-689fd7ffb0a6","Type":"ContainerDied","Data":"f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c"} Apr 16 17:58:54.006216 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.006144 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" Apr 16 17:58:54.006216 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.006176 2580 scope.go:117] "RemoveContainer" containerID="f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c" Apr 16 17:58:54.006316 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.006144 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87" event={"ID":"0b140d1d-e742-4682-adcb-689fd7ffb0a6","Type":"ContainerDied","Data":"908ad268dc1af613339852262aa36ac91ef03960dbfc7ea8dda35e28f449d301"} Apr 16 17:58:54.014816 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.014794 2580 scope.go:117] "RemoveContainer" containerID="dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63" Apr 16 17:58:54.023046 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.023026 2580 scope.go:117] "RemoveContainer" containerID="84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5" Apr 16 17:58:54.030402 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.030381 2580 scope.go:117] "RemoveContainer" containerID="f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c" Apr 16 17:58:54.030667 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:58:54.030645 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c\": container with ID starting with f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c not found: ID does not exist" containerID="f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c" Apr 16 17:58:54.030715 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.030677 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c"} err="failed to get container status \"f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c\": rpc error: code = NotFound desc = could not find container \"f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c\": container with ID starting with f89db0ebc53973dcbfe821e826bc1a66ed1a217e0ddcec868a517a2ea4c7613c not found: ID does not exist" Apr 16 17:58:54.030715 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.030698 2580 scope.go:117] "RemoveContainer" containerID="dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63" Apr 16 17:58:54.030919 ip-10-0-143-216 kubenswrapper[2580]: E0416 17:58:54.030901 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63\": container with ID starting with dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63 not found: ID does not exist" containerID="dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63" Apr 16 17:58:54.030959 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.030927 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63"} err="failed to get container status \"dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63\": rpc error: code = NotFound desc = could not find container \"dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63\": container with ID starting with dfab1e82eec101b04be915452f2e8e83c8b95e4a9f5d60d4f0278c4a34e65c63 not found: ID does not exist" Apr 16 17:58:54.030959 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.030944 2580 scope.go:117] "RemoveContainer" containerID="84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5" Apr 16 17:58:54.031242 ip-10-0-143-216 
kubenswrapper[2580]: E0416 17:58:54.031219 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5\": container with ID starting with 84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5 not found: ID does not exist" containerID="84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5" Apr 16 17:58:54.031285 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.031252 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5"} err="failed to get container status \"84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5\": rpc error: code = NotFound desc = could not find container \"84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5\": container with ID starting with 84ebe8e8bed533bd316953245c462dc494e778605d79df7683e1c72213e7ecf5 not found: ID does not exist" Apr 16 17:58:54.041731 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.041710 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87"] Apr 16 17:58:54.047556 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:54.047534 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5dfd97d8snd87"] Apr 16 17:58:55.264241 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:58:55.264209 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" path="/var/lib/kubelet/pods/0b140d1d-e742-4682-adcb-689fd7ffb0a6/volumes" Apr 16 17:59:02.471428 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.471380 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7"] 
Apr 16 17:59:02.471954 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.471934 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="tokenizer" Apr 16 17:59:02.472028 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.471958 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="tokenizer" Apr 16 17:59:02.472028 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.471982 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2be45b1-7216-4bad-ac53-ba227989a660" containerName="storage-initializer" Apr 16 17:59:02.472028 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.471991 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2be45b1-7216-4bad-ac53-ba227989a660" containerName="storage-initializer" Apr 16 17:59:02.472028 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472009 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="storage-initializer" Apr 16 17:59:02.472028 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472017 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="storage-initializer" Apr 16 17:59:02.472314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472037 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="main" Apr 16 17:59:02.472314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472045 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="main" Apr 16 17:59:02.472314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472055 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2be45b1-7216-4bad-ac53-ba227989a660" 
containerName="main" Apr 16 17:59:02.472314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472063 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2be45b1-7216-4bad-ac53-ba227989a660" containerName="main" Apr 16 17:59:02.472314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472196 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="tokenizer" Apr 16 17:59:02.472314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472212 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2be45b1-7216-4bad-ac53-ba227989a660" containerName="main" Apr 16 17:59:02.472314 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.472226 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b140d1d-e742-4682-adcb-689fd7ffb0a6" containerName="main" Apr 16 17:59:02.477717 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.477685 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.488851 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.488802 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 17:59:02.489069 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.489052 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 17:59:02.489365 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.489343 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-2ks6n\"" Apr 16 17:59:02.489475 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.489409 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 
17:59:02.491352 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.491330 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7"] Apr 16 17:59:02.493553 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.493540 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 17:59:02.620551 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.620519 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.620551 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.620555 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.620765 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.620573 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8dg\" (UniqueName: \"kubernetes.io/projected/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kube-api-access-pb8dg\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.620765 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.620600 
2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.620765 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.620654 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.620765 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.620712 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.721690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.721602 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.721690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.721636 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.721690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.721655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8dg\" (UniqueName: \"kubernetes.io/projected/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kube-api-access-pb8dg\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.721690 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.721673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.722022 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.721705 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.722022 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.721728 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.722129 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.722040 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.722129 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.722070 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.722129 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.722098 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.722277 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.722189 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.724268 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.724252 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.738574 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.738548 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8dg\" (UniqueName: \"kubernetes.io/projected/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kube-api-access-pb8dg\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-jhtm7\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.787751 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.787711 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:02.931993 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.931966 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7"] Apr 16 17:59:02.933803 ip-10-0-143-216 kubenswrapper[2580]: W0416 17:59:02.933777 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb36fb0_8259_4a7d_ac8e_d627ff4b1013.slice/crio-4f65829082b5997292fc383274aa51b38c1a93c5d90ec56f331438c4127d3c16 WatchSource:0}: Error finding container 4f65829082b5997292fc383274aa51b38c1a93c5d90ec56f331438c4127d3c16: Status 404 returned error can't find the container with id 4f65829082b5997292fc383274aa51b38c1a93c5d90ec56f331438c4127d3c16 Apr 16 17:59:02.935675 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:02.935658 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:59:03.042298 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:03.042259 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" event={"ID":"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013","Type":"ContainerStarted","Data":"02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590"} Apr 16 17:59:03.042457 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:03.042305 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" event={"ID":"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013","Type":"ContainerStarted","Data":"4f65829082b5997292fc383274aa51b38c1a93c5d90ec56f331438c4127d3c16"} Apr 16 17:59:04.047468 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:04.047438 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerID="02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590" exitCode=0 Apr 16 17:59:04.047873 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:04.047495 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" event={"ID":"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013","Type":"ContainerDied","Data":"02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590"} Apr 16 17:59:05.056118 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:05.056080 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" event={"ID":"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013","Type":"ContainerStarted","Data":"4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd"} Apr 16 17:59:05.056118 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:05.056119 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" event={"ID":"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013","Type":"ContainerStarted","Data":"7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda"} Apr 16 17:59:05.056573 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:05.056289 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:05.094979 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:05.094936 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" podStartSLOduration=3.094918967 podStartE2EDuration="3.094918967s" podCreationTimestamp="2026-04-16 17:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
17:59:05.093430969 +0000 UTC m=+1118.467605399" watchObservedRunningTime="2026-04-16 17:59:05.094918967 +0000 UTC m=+1118.469093395" Apr 16 17:59:12.788258 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:12.788213 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:12.788258 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:12.788247 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:12.790888 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:12.790864 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:13.086917 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:13.086833 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 17:59:34.090496 ip-10-0-143-216 kubenswrapper[2580]: I0416 17:59:34.090464 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 18:00:27.436064 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:00:27.436030 2580 scope.go:117] "RemoveContainer" containerID="1daddb9a27057108312e77bd3279f14e4f81f64071f14cae15a975f11007928a" Apr 16 18:00:54.258625 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:00:54.258588 2580 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" secret="" err="secret \"stop-feature-test-epp-sa-dockercfg-2ks6n\" not found" Apr 16 18:00:54.449905 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:00:54.449658 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:00:54.449905 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:00:54.449736 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs podName:1cb36fb0-8259-4a7d-ac8e-d627ff4b1013 nodeName:}" failed. No retries permitted until 2026-04-16 18:00:54.949715884 +0000 UTC m=+1228.323890297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs") pod "stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:00:54.955130 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:00:54.955095 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:00:54.955335 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:00:54.955223 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs podName:1cb36fb0-8259-4a7d-ac8e-d627ff4b1013 nodeName:}" failed. No retries permitted until 2026-04-16 18:00:55.955197799 +0000 UTC m=+1229.329372211 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs") pod "stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:00:55.962297 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:00:55.962259 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:00:55.962687 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:00:55.962331 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs podName:1cb36fb0-8259-4a7d-ac8e-d627ff4b1013 nodeName:}" failed. No retries permitted until 2026-04-16 18:00:57.96231634 +0000 UTC m=+1231.336490746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs") pod "stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:00:57.978781 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:00:57.978737 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:00:57.979312 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:00:57.978818 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs podName:1cb36fb0-8259-4a7d-ac8e-d627ff4b1013 nodeName:}" failed. No retries permitted until 2026-04-16 18:01:01.978800639 +0000 UTC m=+1235.352975045 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs") pod "stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:01:02.017431 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:01:02.017396 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:01:02.017890 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:01:02.017471 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs podName:1cb36fb0-8259-4a7d-ac8e-d627ff4b1013 nodeName:}" failed. No retries permitted until 2026-04-16 18:01:10.017455876 +0000 UTC m=+1243.391630281 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs") pod "stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 18:01:08.795192 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:08.792599 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7"] Apr 16 18:01:08.795192 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:08.793310 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="main" containerID="cri-o://7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda" gracePeriod=30 Apr 16 18:01:08.795192 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:08.794095 2580 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="tokenizer" containerID="cri-o://4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd" gracePeriod=30 Apr 16 18:01:09.494440 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:09.494407 2580 generic.go:358] "Generic (PLEG): container finished" podID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerID="7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda" exitCode=0 Apr 16 18:01:09.494610 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:09.494451 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" event={"ID":"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013","Type":"ContainerDied","Data":"7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda"} Apr 16 18:01:10.046470 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.046446 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 18:01:10.094493 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.094461 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb8dg\" (UniqueName: \"kubernetes.io/projected/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kube-api-access-pb8dg\") pod \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " Apr 16 18:01:10.094697 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.094518 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-cache\") pod \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " Apr 16 18:01:10.094697 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.094586 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-tmp\") pod \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " Apr 16 18:01:10.094697 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.094648 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kserve-provision-location\") pod \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " Apr 16 18:01:10.094697 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.094690 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs\") pod \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " 
Apr 16 18:01:10.094914 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.094718 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-uds\") pod \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\" (UID: \"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013\") " Apr 16 18:01:10.094914 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.094799 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:01:10.095026 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.094935 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:01:10.095118 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.095092 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:01:10.095118 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.095107 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:01:10.095358 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.095130 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-tmp\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:01:10.095767 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.095736 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:01:10.096837 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.096817 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kube-api-access-pb8dg" (OuterVolumeSpecName: "kube-api-access-pb8dg") pod "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013"). InnerVolumeSpecName "kube-api-access-pb8dg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:01:10.096945 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.096921 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" (UID: "1cb36fb0-8259-4a7d-ac8e-d627ff4b1013"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:01:10.196152 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.196116 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:01:10.196152 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.196146 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-tokenizer-uds\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:01:10.196152 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.196180 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pb8dg\" (UniqueName: \"kubernetes.io/projected/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kube-api-access-pb8dg\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:01:10.196407 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.196193 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:01:10.499145 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.499112 2580 generic.go:358] "Generic (PLEG): container finished" podID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" 
containerID="4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd" exitCode=0 Apr 16 18:01:10.499326 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.499199 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" event={"ID":"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013","Type":"ContainerDied","Data":"4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd"} Apr 16 18:01:10.499326 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.499227 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" Apr 16 18:01:10.499326 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.499246 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7" event={"ID":"1cb36fb0-8259-4a7d-ac8e-d627ff4b1013","Type":"ContainerDied","Data":"4f65829082b5997292fc383274aa51b38c1a93c5d90ec56f331438c4127d3c16"} Apr 16 18:01:10.499326 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.499262 2580 scope.go:117] "RemoveContainer" containerID="4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd" Apr 16 18:01:10.508312 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.508296 2580 scope.go:117] "RemoveContainer" containerID="7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda" Apr 16 18:01:10.515686 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.515670 2580 scope.go:117] "RemoveContainer" containerID="02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590" Apr 16 18:01:10.522612 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.522590 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7"] Apr 16 18:01:10.524016 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.524002 2580 scope.go:117] 
"RemoveContainer" containerID="4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd" Apr 16 18:01:10.524299 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:01:10.524281 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd\": container with ID starting with 4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd not found: ID does not exist" containerID="4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd" Apr 16 18:01:10.524352 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.524310 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd"} err="failed to get container status \"4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd\": rpc error: code = NotFound desc = could not find container \"4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd\": container with ID starting with 4c881644b2b5a23ac30982988125b06772f0e1c34c2c75a016f00dc781da65fd not found: ID does not exist" Apr 16 18:01:10.524352 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.524329 2580 scope.go:117] "RemoveContainer" containerID="7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda" Apr 16 18:01:10.524545 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:01:10.524527 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda\": container with ID starting with 7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda not found: ID does not exist" containerID="7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda" Apr 16 18:01:10.524609 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.524555 2580 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda"} err="failed to get container status \"7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda\": rpc error: code = NotFound desc = could not find container \"7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda\": container with ID starting with 7f156445abbf4a5b7916f5f004700e2acb27f84f82ee308ee2201cd87e588fda not found: ID does not exist" Apr 16 18:01:10.524609 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.524579 2580 scope.go:117] "RemoveContainer" containerID="02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590" Apr 16 18:01:10.524846 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:01:10.524822 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590\": container with ID starting with 02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590 not found: ID does not exist" containerID="02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590" Apr 16 18:01:10.524939 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.524846 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590"} err="failed to get container status \"02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590\": rpc error: code = NotFound desc = could not find container \"02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590\": container with ID starting with 02b02e4f306c8b310c963fe76f2c5fd2492724559d7fb91e259b15f7f0028590 not found: ID does not exist" Apr 16 18:01:10.527587 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:10.527568 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-jhtm7"] Apr 16 18:01:11.264018 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:11.263987 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" path="/var/lib/kubelet/pods/1cb36fb0-8259-4a7d-ac8e-d627ff4b1013/volumes" Apr 16 18:01:21.927259 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927203 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn"] Apr 16 18:01:21.928746 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927794 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="storage-initializer" Apr 16 18:01:21.928746 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927812 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="storage-initializer" Apr 16 18:01:21.928746 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927825 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="main" Apr 16 18:01:21.928746 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927833 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="main" Apr 16 18:01:21.928746 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927866 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="tokenizer" Apr 16 18:01:21.928746 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927876 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="tokenizer" Apr 16 18:01:21.928746 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927973 2580 
memory_manager.go:356] "RemoveStaleState removing state" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="main" Apr 16 18:01:21.928746 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.927987 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1cb36fb0-8259-4a7d-ac8e-d627ff4b1013" containerName="tokenizer" Apr 16 18:01:21.930641 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.930619 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:21.933119 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.933096 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 18:01:21.933119 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.933116 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 18:01:21.933834 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.933644 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-gf8cw\"" Apr 16 18:01:21.933834 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.933663 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:01:21.933834 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.933666 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 18:01:21.941855 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:21.941834 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn"] Apr 16 18:01:22.004309 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.004276 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5mh9\" (UniqueName: \"kubernetes.io/projected/3a919b90-fcba-46ea-8fed-90d226d0546a-kube-api-access-s5mh9\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.004309 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.004311 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.004561 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.004338 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.004561 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.004442 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a919b90-fcba-46ea-8fed-90d226d0546a-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.004561 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.004471 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.004708 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.004610 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.105844 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.105809 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a919b90-fcba-46ea-8fed-90d226d0546a-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.105844 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.105844 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.106038 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.105898 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.106038 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.105920 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5mh9\" (UniqueName: \"kubernetes.io/projected/3a919b90-fcba-46ea-8fed-90d226d0546a-kube-api-access-s5mh9\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.106038 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.105937 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.106038 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.105973 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.106424 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.106402 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.106491 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.106419 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.106491 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.106464 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.106770 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.106492 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.108480 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.108459 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a919b90-fcba-46ea-8fed-90d226d0546a-tls-certs\") pod 
\"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.119598 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.119573 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5mh9\" (UniqueName: \"kubernetes.io/projected/3a919b90-fcba-46ea-8fed-90d226d0546a-kube-api-access-s5mh9\") pod \"stop-feature-test-kserve-router-scheduler-6954779799-rstsn\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.242528 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.242440 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:22.372240 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.372217 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn"] Apr 16 18:01:22.374271 ip-10-0-143-216 kubenswrapper[2580]: W0416 18:01:22.374243 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a919b90_fcba_46ea_8fed_90d226d0546a.slice/crio-d67931d67f1f9c37e77d9206b44d3d904c75f81f216bdbdc3ee3f3b4993e6e3b WatchSource:0}: Error finding container d67931d67f1f9c37e77d9206b44d3d904c75f81f216bdbdc3ee3f3b4993e6e3b: Status 404 returned error can't find the container with id d67931d67f1f9c37e77d9206b44d3d904c75f81f216bdbdc3ee3f3b4993e6e3b Apr 16 18:01:22.542704 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.542621 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" 
event={"ID":"3a919b90-fcba-46ea-8fed-90d226d0546a","Type":"ContainerStarted","Data":"35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176"} Apr 16 18:01:22.542704 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:22.542658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" event={"ID":"3a919b90-fcba-46ea-8fed-90d226d0546a","Type":"ContainerStarted","Data":"d67931d67f1f9c37e77d9206b44d3d904c75f81f216bdbdc3ee3f3b4993e6e3b"} Apr 16 18:01:23.547721 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:23.547684 2580 generic.go:358] "Generic (PLEG): container finished" podID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerID="35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176" exitCode=0 Apr 16 18:01:23.548090 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:23.547765 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" event={"ID":"3a919b90-fcba-46ea-8fed-90d226d0546a","Type":"ContainerDied","Data":"35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176"} Apr 16 18:01:24.556552 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:24.556516 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" event={"ID":"3a919b90-fcba-46ea-8fed-90d226d0546a","Type":"ContainerStarted","Data":"20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079"} Apr 16 18:01:24.556552 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:24.556555 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" event={"ID":"3a919b90-fcba-46ea-8fed-90d226d0546a","Type":"ContainerStarted","Data":"b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0"} Apr 16 18:01:24.557017 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:24.556708 
2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:24.597873 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:24.597817 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" podStartSLOduration=3.597799661 podStartE2EDuration="3.597799661s" podCreationTimestamp="2026-04-16 18:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:01:24.595486249 +0000 UTC m=+1257.969660677" watchObservedRunningTime="2026-04-16 18:01:24.597799661 +0000 UTC m=+1257.971974088" Apr 16 18:01:32.242732 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:32.242695 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:32.243183 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:32.242857 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:32.245601 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:32.245580 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:32.585841 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:32.585755 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:01:54.593997 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:01:54.593968 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 
16 18:03:13.138496 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:13.138458 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn"] Apr 16 18:03:13.138996 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:13.138866 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="main" containerID="cri-o://b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0" gracePeriod=30 Apr 16 18:03:13.138996 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:13.138905 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="tokenizer" containerID="cri-o://20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079" gracePeriod=30 Apr 16 18:03:13.942057 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:13.942019 2580 generic.go:358] "Generic (PLEG): container finished" podID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerID="b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0" exitCode=0 Apr 16 18:03:13.942244 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:13.942059 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" event={"ID":"3a919b90-fcba-46ea-8fed-90d226d0546a","Type":"ContainerDied","Data":"b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0"} Apr 16 18:03:14.399338 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.399318 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:03:14.512235 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512127 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-kserve-provision-location\") pod \"3a919b90-fcba-46ea-8fed-90d226d0546a\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " Apr 16 18:03:14.512235 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512186 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a919b90-fcba-46ea-8fed-90d226d0546a-tls-certs\") pod \"3a919b90-fcba-46ea-8fed-90d226d0546a\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " Apr 16 18:03:14.512235 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512234 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5mh9\" (UniqueName: \"kubernetes.io/projected/3a919b90-fcba-46ea-8fed-90d226d0546a-kube-api-access-s5mh9\") pod \"3a919b90-fcba-46ea-8fed-90d226d0546a\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " Apr 16 18:03:14.512510 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512259 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-tmp\") pod \"3a919b90-fcba-46ea-8fed-90d226d0546a\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " Apr 16 18:03:14.512510 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512282 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-uds\") pod \"3a919b90-fcba-46ea-8fed-90d226d0546a\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " Apr 16 
18:03:14.512510 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512309 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-cache\") pod \"3a919b90-fcba-46ea-8fed-90d226d0546a\" (UID: \"3a919b90-fcba-46ea-8fed-90d226d0546a\") " Apr 16 18:03:14.512662 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512552 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3a919b90-fcba-46ea-8fed-90d226d0546a" (UID: "3a919b90-fcba-46ea-8fed-90d226d0546a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:14.512662 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512626 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3a919b90-fcba-46ea-8fed-90d226d0546a" (UID: "3a919b90-fcba-46ea-8fed-90d226d0546a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:14.512768 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512718 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3a919b90-fcba-46ea-8fed-90d226d0546a" (UID: "3a919b90-fcba-46ea-8fed-90d226d0546a"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:14.512970 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.512943 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3a919b90-fcba-46ea-8fed-90d226d0546a" (UID: "3a919b90-fcba-46ea-8fed-90d226d0546a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:14.514578 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.514546 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a919b90-fcba-46ea-8fed-90d226d0546a-kube-api-access-s5mh9" (OuterVolumeSpecName: "kube-api-access-s5mh9") pod "3a919b90-fcba-46ea-8fed-90d226d0546a" (UID: "3a919b90-fcba-46ea-8fed-90d226d0546a"). InnerVolumeSpecName "kube-api-access-s5mh9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:03:14.514687 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.514553 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a919b90-fcba-46ea-8fed-90d226d0546a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3a919b90-fcba-46ea-8fed-90d226d0546a" (UID: "3a919b90-fcba-46ea-8fed-90d226d0546a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:03:14.613519 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.613478 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3a919b90-fcba-46ea-8fed-90d226d0546a-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:03:14.613519 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.613510 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5mh9\" (UniqueName: \"kubernetes.io/projected/3a919b90-fcba-46ea-8fed-90d226d0546a-kube-api-access-s5mh9\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:03:14.613519 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.613521 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-tmp\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:03:14.613519 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.613531 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-uds\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:03:14.613785 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.613539 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-tokenizer-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:03:14.613785 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.613547 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3a919b90-fcba-46ea-8fed-90d226d0546a-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:03:14.948553 ip-10-0-143-216 
kubenswrapper[2580]: I0416 18:03:14.948519 2580 generic.go:358] "Generic (PLEG): container finished" podID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerID="20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079" exitCode=0 Apr 16 18:03:14.948718 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.948602 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" Apr 16 18:03:14.948718 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.948602 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" event={"ID":"3a919b90-fcba-46ea-8fed-90d226d0546a","Type":"ContainerDied","Data":"20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079"} Apr 16 18:03:14.948718 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.948645 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn" event={"ID":"3a919b90-fcba-46ea-8fed-90d226d0546a","Type":"ContainerDied","Data":"d67931d67f1f9c37e77d9206b44d3d904c75f81f216bdbdc3ee3f3b4993e6e3b"} Apr 16 18:03:14.948718 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.948666 2580 scope.go:117] "RemoveContainer" containerID="20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079" Apr 16 18:03:14.957853 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.957835 2580 scope.go:117] "RemoveContainer" containerID="b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0" Apr 16 18:03:14.965517 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.965499 2580 scope.go:117] "RemoveContainer" containerID="35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176" Apr 16 18:03:14.972919 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.972900 2580 scope.go:117] "RemoveContainer" 
containerID="20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079" Apr 16 18:03:14.973180 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:03:14.973140 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079\": container with ID starting with 20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079 not found: ID does not exist" containerID="20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079" Apr 16 18:03:14.973260 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.973198 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079"} err="failed to get container status \"20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079\": rpc error: code = NotFound desc = could not find container \"20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079\": container with ID starting with 20e9782ae6acb2c769d9fb742405499b1fc1c4c60d1c83d279e44b92fe08e079 not found: ID does not exist" Apr 16 18:03:14.973260 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.973227 2580 scope.go:117] "RemoveContainer" containerID="b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0" Apr 16 18:03:14.973472 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:03:14.973454 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0\": container with ID starting with b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0 not found: ID does not exist" containerID="b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0" Apr 16 18:03:14.973511 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.973482 2580 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0"} err="failed to get container status \"b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0\": rpc error: code = NotFound desc = could not find container \"b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0\": container with ID starting with b1dac3ecf494206845860d774473db578f593afbcb179a0cd693044b3888fdb0 not found: ID does not exist" Apr 16 18:03:14.973511 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.973500 2580 scope.go:117] "RemoveContainer" containerID="35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176" Apr 16 18:03:14.973716 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:03:14.973699 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176\": container with ID starting with 35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176 not found: ID does not exist" containerID="35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176" Apr 16 18:03:14.973759 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.973723 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176"} err="failed to get container status \"35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176\": rpc error: code = NotFound desc = could not find container \"35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176\": container with ID starting with 35d8bcff764fffc5b1170e61106daef99b6fd6a3f4221f4be733935a9a3d7176 not found: ID does not exist" Apr 16 18:03:14.980592 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.980573 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn"] Apr 16 
18:03:14.985244 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:14.985223 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-6954779799-rstsn"] Apr 16 18:03:15.263104 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.263024 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" path="/var/lib/kubelet/pods/3a919b90-fcba-46ea-8fed-90d226d0546a/volumes" Apr 16 18:03:15.382653 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.382613 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5578b86f79-4x5xj"] Apr 16 18:03:15.383006 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.382994 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="main" Apr 16 18:03:15.383055 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.383009 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="main" Apr 16 18:03:15.383055 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.383018 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="tokenizer" Apr 16 18:03:15.383055 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.383024 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="tokenizer" Apr 16 18:03:15.383055 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.383040 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="storage-initializer" Apr 16 18:03:15.383055 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.383047 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="storage-initializer" 
Apr 16 18:03:15.383267 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.383105 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="main" Apr 16 18:03:15.383267 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.383115 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a919b90-fcba-46ea-8fed-90d226d0546a" containerName="tokenizer" Apr 16 18:03:15.387818 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.387796 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:15.389820 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.389796 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-94czs\"" Apr 16 18:03:15.390400 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.390386 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:03:15.395421 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.395398 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5578b86f79-4x5xj"] Apr 16 18:03:15.422304 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.422276 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pch\" (UniqueName: \"kubernetes.io/projected/a1620162-a9d2-4631-977e-38406f321739-kube-api-access-m5pch\") pod \"llmisvc-controller-manager-5578b86f79-4x5xj\" (UID: \"a1620162-a9d2-4631-977e-38406f321739\") " pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:15.422651 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.422340 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1620162-a9d2-4631-977e-38406f321739-cert\") 
pod \"llmisvc-controller-manager-5578b86f79-4x5xj\" (UID: \"a1620162-a9d2-4631-977e-38406f321739\") " pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:15.522990 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.522899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1620162-a9d2-4631-977e-38406f321739-cert\") pod \"llmisvc-controller-manager-5578b86f79-4x5xj\" (UID: \"a1620162-a9d2-4631-977e-38406f321739\") " pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:15.522990 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.522970 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5pch\" (UniqueName: \"kubernetes.io/projected/a1620162-a9d2-4631-977e-38406f321739-kube-api-access-m5pch\") pod \"llmisvc-controller-manager-5578b86f79-4x5xj\" (UID: \"a1620162-a9d2-4631-977e-38406f321739\") " pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:15.525461 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.525433 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1620162-a9d2-4631-977e-38406f321739-cert\") pod \"llmisvc-controller-manager-5578b86f79-4x5xj\" (UID: \"a1620162-a9d2-4631-977e-38406f321739\") " pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:15.532313 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.532287 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5pch\" (UniqueName: \"kubernetes.io/projected/a1620162-a9d2-4631-977e-38406f321739-kube-api-access-m5pch\") pod \"llmisvc-controller-manager-5578b86f79-4x5xj\" (UID: \"a1620162-a9d2-4631-977e-38406f321739\") " pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:15.698507 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.698470 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:15.827675 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.827653 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5578b86f79-4x5xj"] Apr 16 18:03:15.829871 ip-10-0-143-216 kubenswrapper[2580]: W0416 18:03:15.829846 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda1620162_a9d2_4631_977e_38406f321739.slice/crio-c157a95aa1d3710c1f8a6e41b97dbb178659e42207668962129aaf4bdb040bbd WatchSource:0}: Error finding container c157a95aa1d3710c1f8a6e41b97dbb178659e42207668962129aaf4bdb040bbd: Status 404 returned error can't find the container with id c157a95aa1d3710c1f8a6e41b97dbb178659e42207668962129aaf4bdb040bbd Apr 16 18:03:15.953010 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:15.952974 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" event={"ID":"a1620162-a9d2-4631-977e-38406f321739","Type":"ContainerStarted","Data":"c157a95aa1d3710c1f8a6e41b97dbb178659e42207668962129aaf4bdb040bbd"} Apr 16 18:03:19.970614 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:19.970571 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" event={"ID":"a1620162-a9d2-4631-977e-38406f321739","Type":"ContainerStarted","Data":"d57a554547a021c822f731a7d476c5621b496cdee403fe1de9ec344a8636ac8e"} Apr 16 18:03:19.971006 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:19.970705 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:03:19.990810 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:19.990761 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" podStartSLOduration=1.477174794 podStartE2EDuration="4.990746837s" 
podCreationTimestamp="2026-04-16 18:03:15 +0000 UTC" firstStartedPulling="2026-04-16 18:03:15.831072123 +0000 UTC m=+1369.205246529" lastFinishedPulling="2026-04-16 18:03:19.344644163 +0000 UTC m=+1372.718818572" observedRunningTime="2026-04-16 18:03:19.988545663 +0000 UTC m=+1373.362720091" watchObservedRunningTime="2026-04-16 18:03:19.990746837 +0000 UTC m=+1373.364921265" Apr 16 18:03:50.976405 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:03:50.976370 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5578b86f79-4x5xj" Apr 16 18:08:01.909107 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.909071 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:08:01.912309 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.912286 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:01.914584 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.914560 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:08:01.915341 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.915323 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 18:08:01.915419 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.915394 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 18:08:01.915476 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.915398 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 18:08:01.915476 ip-10-0-143-216 kubenswrapper[2580]: I0416 
18:08:01.915399 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-4mcrc\"" Apr 16 18:08:01.926541 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.926515 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:08:01.952478 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.952325 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"] Apr 16 18:08:01.955694 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.955669 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:01.957791 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.957773 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-gq6j6\"" Apr 16 18:08:01.966898 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:01.966876 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"] Apr 16 18:08:02.060974 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.060938 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.060974 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.060974 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phvnr\" (UniqueName: \"kubernetes.io/projected/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kube-api-access-phvnr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.060996 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061017 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061041 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061087 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061104 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dsk\" (UniqueName: \"kubernetes.io/projected/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kube-api-access-27dsk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061231 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061248 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.061265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061266 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.061578 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.061283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.162439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162348 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.162439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162388 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-phvnr\" (UniqueName: \"kubernetes.io/projected/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kube-api-access-phvnr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.162439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162409 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.162663 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162460 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.162663 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162493 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.162663 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162546 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.162663 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162588 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27dsk\" (UniqueName: \"kubernetes.io/projected/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kube-api-access-27dsk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.162852 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162720 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.162852 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162815 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.162957 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162861 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-tmp\") pod 
\"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.162957 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162861 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.162957 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162887 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.162957 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.163211 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.162968 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-home\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.163211 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.163007 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.163211 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.163010 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.163211 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.163115 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.163439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.163246 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.163439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.163318 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.165445 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.165419 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.165982 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.165959 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.166480 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.166458 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.171381 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.171352 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27dsk\" (UniqueName: \"kubernetes.io/projected/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kube-api-access-27dsk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.171660 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.171641 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phvnr\" (UniqueName: \"kubernetes.io/projected/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kube-api-access-phvnr\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.222812 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.222776 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 18:08:02.266807 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.266773 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:02.363396 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.363370 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 18:08:02.365088 ip-10-0-143-216 kubenswrapper[2580]: W0416 18:08:02.365051 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b8b723_36f3_4afb_ac24_607f9dd1c79a.slice/crio-96c82299590481865496bd46ae1aefb46adf48ce53e34f51e72bfbf39def8e83 WatchSource:0}: Error finding container 96c82299590481865496bd46ae1aefb46adf48ce53e34f51e72bfbf39def8e83: Status 404 returned error can't find the container with id 96c82299590481865496bd46ae1aefb46adf48ce53e34f51e72bfbf39def8e83 Apr 16 18:08:02.367795 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.367774 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:08:02.412044 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.412013 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"] Apr 16 18:08:02.426329 ip-10-0-143-216 kubenswrapper[2580]: W0416 18:08:02.426299 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39f57fcf_2549_4d3a_b563_9cc8efef62d9.slice/crio-25c607440c9e7d664cfc03a66cfb13ded3234ba096314132c2af62a278f7da15 WatchSource:0}: Error finding container 25c607440c9e7d664cfc03a66cfb13ded3234ba096314132c2af62a278f7da15: Status 404 returned error can't find the container with id 25c607440c9e7d664cfc03a66cfb13ded3234ba096314132c2af62a278f7da15 Apr 16 18:08:02.967973 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.967832 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" event={"ID":"39f57fcf-2549-4d3a-b563-9cc8efef62d9","Type":"ContainerStarted","Data":"e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672"} Apr 16 18:08:02.967973 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.967972 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" event={"ID":"39f57fcf-2549-4d3a-b563-9cc8efef62d9","Type":"ContainerStarted","Data":"25c607440c9e7d664cfc03a66cfb13ded3234ba096314132c2af62a278f7da15"} Apr 16 18:08:02.969668 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.969636 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"f9b8b723-36f3-4afb-ac24-607f9dd1c79a","Type":"ContainerStarted","Data":"8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d"} Apr 16 18:08:02.969668 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:02.969674 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"f9b8b723-36f3-4afb-ac24-607f9dd1c79a","Type":"ContainerStarted","Data":"96c82299590481865496bd46ae1aefb46adf48ce53e34f51e72bfbf39def8e83"} Apr 16 18:08:03.975773 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:03.975740 2580 generic.go:358] "Generic (PLEG): container finished" podID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerID="e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672" exitCode=0 Apr 16 18:08:03.976130 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:03.975832 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" 
event={"ID":"39f57fcf-2549-4d3a-b563-9cc8efef62d9","Type":"ContainerDied","Data":"e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672"} Apr 16 18:08:04.985678 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:04.985639 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" event={"ID":"39f57fcf-2549-4d3a-b563-9cc8efef62d9","Type":"ContainerStarted","Data":"c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3"} Apr 16 18:08:04.986081 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:04.985686 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" event={"ID":"39f57fcf-2549-4d3a-b563-9cc8efef62d9","Type":"ContainerStarted","Data":"f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0"} Apr 16 18:08:04.986081 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:04.985796 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" Apr 16 18:08:05.011351 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:05.011287 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" podStartSLOduration=4.011268952 podStartE2EDuration="4.011268952s" podCreationTimestamp="2026-04-16 18:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:08:05.009250525 +0000 UTC m=+1658.383424949" watchObservedRunningTime="2026-04-16 18:08:05.011268952 +0000 UTC m=+1658.385443385" Apr 16 18:08:06.995426 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:06.995396 2580 generic.go:358] "Generic (PLEG): container finished" podID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" 
containerID="8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d" exitCode=0
Apr 16 18:08:06.995765 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:06.995441 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"f9b8b723-36f3-4afb-ac24-607f9dd1c79a","Type":"ContainerDied","Data":"8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d"}
Apr 16 18:08:12.269310 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:12.268602 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"
Apr 16 18:08:12.269310 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:12.268639 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"
Apr 16 18:08:12.269310 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:12.268963 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.42:8082/healthz\": dial tcp 10.134.0.42:8082: connect: connection refused"
Apr 16 18:08:22.269265 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:22.269222 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"
Apr 16 18:08:22.270597 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:22.270571 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"
Apr 16 18:08:35.127964 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:35.127923 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"f9b8b723-36f3-4afb-ac24-607f9dd1c79a","Type":"ContainerStarted","Data":"c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b"}
Apr 16 18:08:35.156004 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:35.155942 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.56183364 podStartE2EDuration="34.155921861s" podCreationTimestamp="2026-04-16 18:08:01 +0000 UTC" firstStartedPulling="2026-04-16 18:08:06.996659839 +0000 UTC m=+1660.370834249" lastFinishedPulling="2026-04-16 18:08:34.590748062 +0000 UTC m=+1687.964922470" observedRunningTime="2026-04-16 18:08:35.153878072 +0000 UTC m=+1688.528052500" watchObservedRunningTime="2026-04-16 18:08:35.155921861 +0000 UTC m=+1688.530096290"
Apr 16 18:08:43.072625 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:08:43.072549 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"
Apr 16 18:11:08.353084 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:08.353051 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"]
Apr 16 18:11:08.353653 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:08.353401 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="main" containerID="cri-o://f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0" gracePeriod=30
Apr 16 18:11:08.354093 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:08.353808 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="tokenizer" containerID="cri-o://c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3" gracePeriod=30
Apr 16 18:11:08.689866 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:08.689836 2580 generic.go:358] "Generic (PLEG): container finished" podID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerID="f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0" exitCode=0
Apr 16 18:11:08.690042 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:08.689915 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" event={"ID":"39f57fcf-2549-4d3a-b563-9cc8efef62d9","Type":"ContainerDied","Data":"f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0"}
Apr 16 18:11:09.609992 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.609971 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"
Apr 16 18:11:09.664722 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.664686 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phvnr\" (UniqueName: \"kubernetes.io/projected/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kube-api-access-phvnr\") pod \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") "
Apr 16 18:11:09.664917 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.664765 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-uds\") pod \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") "
Apr 16 18:11:09.664917 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.664794 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-tmp\") pod \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") "
Apr 16 18:11:09.664917 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.664850 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kserve-provision-location\") pod \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") "
Apr 16 18:11:09.664917 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.664878 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tls-certs\") pod \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") "
Apr 16 18:11:09.665129 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.664927 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-cache\") pod \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\" (UID: \"39f57fcf-2549-4d3a-b563-9cc8efef62d9\") "
Apr 16 18:11:09.665129 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.665059 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "39f57fcf-2549-4d3a-b563-9cc8efef62d9" (UID: "39f57fcf-2549-4d3a-b563-9cc8efef62d9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:09.665282 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.665187 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "39f57fcf-2549-4d3a-b563-9cc8efef62d9" (UID: "39f57fcf-2549-4d3a-b563-9cc8efef62d9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:09.665343 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.665319 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "39f57fcf-2549-4d3a-b563-9cc8efef62d9" (UID: "39f57fcf-2549-4d3a-b563-9cc8efef62d9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:09.665398 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.665378 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-uds\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:09.665446 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.665405 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-tmp\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:09.665784 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.665714 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "39f57fcf-2549-4d3a-b563-9cc8efef62d9" (UID: "39f57fcf-2549-4d3a-b563-9cc8efef62d9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:09.667217 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.667130 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kube-api-access-phvnr" (OuterVolumeSpecName: "kube-api-access-phvnr") pod "39f57fcf-2549-4d3a-b563-9cc8efef62d9" (UID: "39f57fcf-2549-4d3a-b563-9cc8efef62d9"). InnerVolumeSpecName "kube-api-access-phvnr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:11:09.667531 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.667483 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "39f57fcf-2549-4d3a-b563-9cc8efef62d9" (UID: "39f57fcf-2549-4d3a-b563-9cc8efef62d9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:11:09.695873 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.695780 2580 generic.go:358] "Generic (PLEG): container finished" podID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerID="c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3" exitCode=0
Apr 16 18:11:09.696027 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.695874 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"
Apr 16 18:11:09.696027 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.695874 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" event={"ID":"39f57fcf-2549-4d3a-b563-9cc8efef62d9","Type":"ContainerDied","Data":"c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3"}
Apr 16 18:11:09.696027 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.695914 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv" event={"ID":"39f57fcf-2549-4d3a-b563-9cc8efef62d9","Type":"ContainerDied","Data":"25c607440c9e7d664cfc03a66cfb13ded3234ba096314132c2af62a278f7da15"}
Apr 16 18:11:09.696027 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.695930 2580 scope.go:117] "RemoveContainer" containerID="c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3"
Apr 16 18:11:09.705367 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.705350 2580 scope.go:117] "RemoveContainer" containerID="f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0"
Apr 16 18:11:09.713637 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.713618 2580 scope.go:117] "RemoveContainer" containerID="e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672"
Apr 16 18:11:09.722072 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.722051 2580 scope.go:117] "RemoveContainer" containerID="c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3"
Apr 16 18:11:09.722396 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:09.722374 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3\": container with ID starting with c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3 not found: ID does not exist" containerID="c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3"
Apr 16 18:11:09.722487 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.722412 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3"} err="failed to get container status \"c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3\": rpc error: code = NotFound desc = could not find container \"c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3\": container with ID starting with c277545cf387f58c1ed1849b1126221c647a78cd3254cb531afb4fbfffb878f3 not found: ID does not exist"
Apr 16 18:11:09.722487 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.722439 2580 scope.go:117] "RemoveContainer" containerID="f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0"
Apr 16 18:11:09.722726 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:09.722706 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0\": container with ID starting with f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0 not found: ID does not exist" containerID="f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0"
Apr 16 18:11:09.722793 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.722735 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0"} err="failed to get container status \"f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0\": rpc error: code = NotFound desc = could not find container \"f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0\": container with ID starting with f1dcb20d06001046c03b32236ca98e2fa8533f9a0f7306ed00f9139adc2a56e0 not found: ID does not exist"
Apr 16 18:11:09.722793 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.722753 2580 scope.go:117] "RemoveContainer" containerID="e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672"
Apr 16 18:11:09.723009 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:09.722994 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672\": container with ID starting with e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672 not found: ID does not exist" containerID="e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672"
Apr 16 18:11:09.723057 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.723014 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672"} err="failed to get container status \"e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672\": rpc error: code = NotFound desc = could not find container \"e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672\": container with ID starting with e225d5626f54f3fdc215c414f592e65e01858e33103843f8292366df9a9c4672 not found: ID does not exist"
Apr 16 18:11:09.723909 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.723888 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"]
Apr 16 18:11:09.728098 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.728077 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheqbsrv"]
Apr 16 18:11:09.766370 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.766338 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:09.766370 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.766363 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:09.766370 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.766376 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/39f57fcf-2549-4d3a-b563-9cc8efef62d9-tokenizer-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:09.766575 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:09.766386 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phvnr\" (UniqueName: \"kubernetes.io/projected/39f57fcf-2549-4d3a-b563-9cc8efef62d9-kube-api-access-phvnr\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:10.183347 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:10.183309 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:11:10.183621 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:10.183598 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" containerName="main" containerID="cri-o://c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b" gracePeriod=30
Apr 16 18:11:11.029673 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.029651 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:11:11.180737 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.180703 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-home\") pod \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") "
Apr 16 18:11:11.180922 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.180754 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27dsk\" (UniqueName: \"kubernetes.io/projected/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kube-api-access-27dsk\") pod \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") "
Apr 16 18:11:11.180922 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.180791 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kserve-provision-location\") pod \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") "
Apr 16 18:11:11.180922 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.180864 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-tls-certs\") pod \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") "
Apr 16 18:11:11.180922 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.180887 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-dshm\") pod \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") "
Apr 16 18:11:11.180922 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.180910 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-model-cache\") pod \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\" (UID: \"f9b8b723-36f3-4afb-ac24-607f9dd1c79a\") "
Apr 16 18:11:11.181278 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.181250 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-home" (OuterVolumeSpecName: "home") pod "f9b8b723-36f3-4afb-ac24-607f9dd1c79a" (UID: "f9b8b723-36f3-4afb-ac24-607f9dd1c79a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:11.181278 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.181261 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-model-cache" (OuterVolumeSpecName: "model-cache") pod "f9b8b723-36f3-4afb-ac24-607f9dd1c79a" (UID: "f9b8b723-36f3-4afb-ac24-607f9dd1c79a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:11.183275 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.183249 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-dshm" (OuterVolumeSpecName: "dshm") pod "f9b8b723-36f3-4afb-ac24-607f9dd1c79a" (UID: "f9b8b723-36f3-4afb-ac24-607f9dd1c79a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:11.183364 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.183249 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f9b8b723-36f3-4afb-ac24-607f9dd1c79a" (UID: "f9b8b723-36f3-4afb-ac24-607f9dd1c79a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:11:11.183364 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.183324 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kube-api-access-27dsk" (OuterVolumeSpecName: "kube-api-access-27dsk") pod "f9b8b723-36f3-4afb-ac24-607f9dd1c79a" (UID: "f9b8b723-36f3-4afb-ac24-607f9dd1c79a"). InnerVolumeSpecName "kube-api-access-27dsk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:11:11.245285 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.245235 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f9b8b723-36f3-4afb-ac24-607f9dd1c79a" (UID: "f9b8b723-36f3-4afb-ac24-607f9dd1c79a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:11.264723 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.264694 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" path="/var/lib/kubelet/pods/39f57fcf-2549-4d3a-b563-9cc8efef62d9/volumes"
Apr 16 18:11:11.282244 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.282210 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:11.282244 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.282244 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-dshm\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:11.282244 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.282254 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-model-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:11.282439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.282262 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-home\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:11.282439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.282271 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-27dsk\" (UniqueName: \"kubernetes.io/projected/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kube-api-access-27dsk\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:11.282439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.282281 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9b8b723-36f3-4afb-ac24-607f9dd1c79a-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:11:11.706075 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.706045 2580 generic.go:358] "Generic (PLEG): container finished" podID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" containerID="c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b" exitCode=0
Apr 16 18:11:11.706279 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.706134 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 18:11:11.706279 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.706136 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"f9b8b723-36f3-4afb-ac24-607f9dd1c79a","Type":"ContainerDied","Data":"c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b"}
Apr 16 18:11:11.706279 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.706195 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"f9b8b723-36f3-4afb-ac24-607f9dd1c79a","Type":"ContainerDied","Data":"96c82299590481865496bd46ae1aefb46adf48ce53e34f51e72bfbf39def8e83"}
Apr 16 18:11:11.706279 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.706213 2580 scope.go:117] "RemoveContainer" containerID="c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b"
Apr 16 18:11:11.724712 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.724698 2580 scope.go:117] "RemoveContainer" containerID="8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d"
Apr 16 18:11:11.737836 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.737787 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:11:11.740601 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.740575 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 18:11:11.788849 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.788824 2580 scope.go:117] "RemoveContainer" containerID="c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b"
Apr 16 18:11:11.789211 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:11.789149 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b\": container with ID starting with c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b not found: ID does not exist" containerID="c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b"
Apr 16 18:11:11.789273 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.789222 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b"} err="failed to get container status \"c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b\": rpc error: code = NotFound desc = could not find container \"c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b\": container with ID starting with c0870b6d78582a6ff24bc0104312b1c53c35cf2da0b43ca19b6af3d8b920b73b not found: ID does not exist"
Apr 16 18:11:11.789273 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.789243 2580 scope.go:117] "RemoveContainer" containerID="8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d"
Apr 16 18:11:11.789536 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:11.789516 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d\": container with ID starting with 8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d not found: ID does not exist" containerID="8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d"
Apr 16 18:11:11.789605 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:11.789547 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d"} err="failed to get container status \"8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d\": rpc error: code = NotFound desc = could not find container \"8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d\": container with ID starting with 8a3bdb67f9e7545ffc3ff3d979541408c26772785c6258f5f38739f20cc4ef5d not found: ID does not exist"
Apr 16 18:11:13.263226 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:13.263148 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" path="/var/lib/kubelet/pods/f9b8b723-36f3-4afb-ac24-607f9dd1c79a/volumes"
Apr 16 18:11:15.845241 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845206 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"]
Apr 16 18:11:15.845645 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845631 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" containerName="storage-initializer"
Apr 16 18:11:15.845684 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845648 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" containerName="storage-initializer"
Apr 16 18:11:15.845684 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845665 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="main"
Apr 16 18:11:15.845684 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845674 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="main"
Apr 16 18:11:15.845776 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845687 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" containerName="main"
Apr 16 18:11:15.845776 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845693 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" containerName="main"
Apr 16 18:11:15.845776 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845704 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="storage-initializer"
Apr 16 18:11:15.845776 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845710 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="storage-initializer"
Apr 16 18:11:15.845776 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845721 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="tokenizer"
Apr 16 18:11:15.845776 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845727 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="tokenizer"
Apr 16 18:11:15.845945 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845784 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="tokenizer"
Apr 16 18:11:15.845945 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845793 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="39f57fcf-2549-4d3a-b563-9cc8efef62d9" containerName="main"
Apr 16 18:11:15.845945 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.845803 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9b8b723-36f3-4afb-ac24-607f9dd1c79a" containerName="main"
Apr 16 18:11:15.851018 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.850994 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:15.854417 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.854399 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 16 18:11:15.855141 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.855117 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:11:15.855274 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.855200 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:11:15.855274 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.855203 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\""
Apr 16 18:11:15.865310 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.865286 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"]
Apr 16 18:11:15.922369 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.922340 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-model-cache\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:15.922369 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.922373 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/268868a0-2a97-4342-ab55-71325fb837de-tls-certs\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:15.922589 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.922500 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:15.922589 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.922527 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-home\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:15.922589 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.922548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-dshm\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:15.922589 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:15.922563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bgrr\" (UniqueName: \"kubernetes.io/projected/268868a0-2a97-4342-ab55-71325fb837de-kube-api-access-5bgrr\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:16.023477 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023436 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-model-cache\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:16.023477 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023475 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/268868a0-2a97-4342-ab55-71325fb837de-tls-certs\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:16.023724 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023529 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"
Apr 16 18:11:16.023724 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023548 2580 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-home\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.023724 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023566 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-dshm\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.023724 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023582 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bgrr\" (UniqueName: \"kubernetes.io/projected/268868a0-2a97-4342-ab55-71325fb837de-kube-api-access-5bgrr\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.023944 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023873 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-model-cache\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.023944 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023898 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-home\") pod 
\"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.024036 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.023957 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.026058 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.026026 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-dshm\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.026356 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.026336 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/268868a0-2a97-4342-ab55-71325fb837de-tls-certs\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.035118 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.035091 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bgrr\" (UniqueName: \"kubernetes.io/projected/268868a0-2a97-4342-ab55-71325fb837de-kube-api-access-5bgrr\") pod \"scheduler-inline-config-test-kserve-99b6bd69d-zk74j\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.156347 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.156272 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx"] Apr 16 18:11:16.160511 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.160484 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.161600 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.161580 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:16.163722 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.163705 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-xmtqt\"" Apr 16 18:11:16.177478 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.177449 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx"] Apr 16 18:11:16.226318 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.226285 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q278\" (UniqueName: \"kubernetes.io/projected/fdd0044f-5615-4a72-ad25-969773390953-kube-api-access-8q278\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.226495 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.226471 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.226541 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.226516 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.226579 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.226553 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd0044f-5615-4a72-ad25-969773390953-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.226613 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.226596 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.226648 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.226624 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.315643 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.315619 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"] Apr 16 18:11:16.317040 ip-10-0-143-216 kubenswrapper[2580]: W0416 18:11:16.317017 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod268868a0_2a97_4342_ab55_71325fb837de.slice/crio-247f2b71b17389064bffb5c4638ff834cc9ed71a01024d80dd944fbe13e2cd97 WatchSource:0}: Error finding container 247f2b71b17389064bffb5c4638ff834cc9ed71a01024d80dd944fbe13e2cd97: Status 404 returned error can't find the container with id 247f2b71b17389064bffb5c4638ff834cc9ed71a01024d80dd944fbe13e2cd97 Apr 16 18:11:16.327907 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.327886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q278\" (UniqueName: \"kubernetes.io/projected/fdd0044f-5615-4a72-ad25-969773390953-kube-api-access-8q278\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.327987 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.327963 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.328055 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.328003 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.328055 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.328022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd0044f-5615-4a72-ad25-969773390953-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.328055 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.328037 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.328243 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.328055 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.328416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.328318 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.328416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.328361 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.328416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.328387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.328564 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.328459 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.330895 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.330873 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd0044f-5615-4a72-ad25-969773390953-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.340612 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.340593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q278\" (UniqueName: \"kubernetes.io/projected/fdd0044f-5615-4a72-ad25-969773390953-kube-api-access-8q278\") pod \"scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.494418 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.494384 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:16.637936 ip-10-0-143-216 kubenswrapper[2580]: W0416 18:11:16.637905 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd0044f_5615_4a72_ad25_969773390953.slice/crio-285dfded83ce3726d877a9d255799c39d0e38079f02e4689520a66b2bd36e9b3 WatchSource:0}: Error finding container 285dfded83ce3726d877a9d255799c39d0e38079f02e4689520a66b2bd36e9b3: Status 404 returned error can't find the container with id 285dfded83ce3726d877a9d255799c39d0e38079f02e4689520a66b2bd36e9b3 Apr 16 18:11:16.638101 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.638078 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx"] Apr 16 18:11:16.727349 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.727316 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" event={"ID":"268868a0-2a97-4342-ab55-71325fb837de","Type":"ContainerStarted","Data":"a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197"} Apr 16 18:11:16.727528 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.727358 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" event={"ID":"268868a0-2a97-4342-ab55-71325fb837de","Type":"ContainerStarted","Data":"247f2b71b17389064bffb5c4638ff834cc9ed71a01024d80dd944fbe13e2cd97"} Apr 16 18:11:16.728816 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.728794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" 
event={"ID":"fdd0044f-5615-4a72-ad25-969773390953","Type":"ContainerStarted","Data":"f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9"} Apr 16 18:11:16.728920 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:16.728821 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" event={"ID":"fdd0044f-5615-4a72-ad25-969773390953","Type":"ContainerStarted","Data":"285dfded83ce3726d877a9d255799c39d0e38079f02e4689520a66b2bd36e9b3"} Apr 16 18:11:17.735418 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:17.735321 2580 generic.go:358] "Generic (PLEG): container finished" podID="fdd0044f-5615-4a72-ad25-969773390953" containerID="f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9" exitCode=0 Apr 16 18:11:17.735789 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:17.735409 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" event={"ID":"fdd0044f-5615-4a72-ad25-969773390953","Type":"ContainerDied","Data":"f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9"} Apr 16 18:11:18.741342 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:18.741307 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" event={"ID":"fdd0044f-5615-4a72-ad25-969773390953","Type":"ContainerStarted","Data":"8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923"} Apr 16 18:11:18.741342 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:18.741341 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" event={"ID":"fdd0044f-5615-4a72-ad25-969773390953","Type":"ContainerStarted","Data":"63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075"} Apr 16 18:11:18.741799 ip-10-0-143-216 kubenswrapper[2580]: 
I0416 18:11:18.741577 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:18.783699 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:18.783613 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" podStartSLOduration=2.783587451 podStartE2EDuration="2.783587451s" podCreationTimestamp="2026-04-16 18:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:18.779958177 +0000 UTC m=+1852.154132618" watchObservedRunningTime="2026-04-16 18:11:18.783587451 +0000 UTC m=+1852.157761881" Apr 16 18:11:20.751229 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:20.751113 2580 generic.go:358] "Generic (PLEG): container finished" podID="268868a0-2a97-4342-ab55-71325fb837de" containerID="a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197" exitCode=0 Apr 16 18:11:20.751229 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:20.751203 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" event={"ID":"268868a0-2a97-4342-ab55-71325fb837de","Type":"ContainerDied","Data":"a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197"} Apr 16 18:11:21.756762 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:21.756723 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" event={"ID":"268868a0-2a97-4342-ab55-71325fb837de","Type":"ContainerStarted","Data":"7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a"} Apr 16 18:11:21.787540 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:21.787492 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" podStartSLOduration=6.78747893 podStartE2EDuration="6.78747893s" podCreationTimestamp="2026-04-16 18:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:21.784317895 +0000 UTC m=+1855.158492338" watchObservedRunningTime="2026-04-16 18:11:21.78747893 +0000 UTC m=+1855.161653358" Apr 16 18:11:26.162693 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:26.162650 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:26.162693 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:26.162704 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:26.175130 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:26.175101 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:26.494935 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:26.494903 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:26.494935 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:26.494944 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:26.497635 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:26.497613 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:26.778413 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:26.778317 2580 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:26.791294 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:26.791263 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:44.718917 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.718886 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"] Apr 16 18:11:44.724955 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.724931 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.730790 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.730754 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 18:11:44.738018 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.737997 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"] Apr 16 18:11:44.797224 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.797195 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.797382 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.797243 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.797382 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.797286 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1c392-1fc7-479f-935c-19d3b4c94f96-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.797382 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.797306 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-home\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.797382 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.797328 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.797382 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.797358 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm75h\" (UniqueName: 
\"kubernetes.io/projected/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kube-api-access-hm75h\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.898724 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.898687 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1c392-1fc7-479f-935c-19d3b4c94f96-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.898724 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.898727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-home\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.899000 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.898743 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.899000 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.898780 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hm75h\" (UniqueName: \"kubernetes.io/projected/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kube-api-access-hm75h\") pod 
\"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.899000 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.898862 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.899000 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.898890 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.899270 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.899223 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-home\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.899316 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.899291 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: 
\"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.899351 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.899313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.901250 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.901229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.901506 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.901491 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1c392-1fc7-479f-935c-19d3b4c94f96-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:44.911143 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:44.911122 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm75h\" (UniqueName: \"kubernetes.io/projected/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kube-api-access-hm75h\") pod \"router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:45.035614 ip-10-0-143-216 
kubenswrapper[2580]: I0416 18:11:45.035521 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" Apr 16 18:11:45.176317 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:45.176284 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"] Apr 16 18:11:45.846034 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:45.846002 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" event={"ID":"a0a1c392-1fc7-479f-935c-19d3b4c94f96","Type":"ContainerStarted","Data":"26c7c369f7aacc0d750a219f67aba363ff8fd13e4402ad1ea2dbe79409df3539"} Apr 16 18:11:45.846034 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:45.846040 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" event={"ID":"a0a1c392-1fc7-479f-935c-19d3b4c94f96","Type":"ContainerStarted","Data":"f99e014ceeee4e4622d4e726ae88ff8a306375866ad53dda5218d318165d3d9e"} Apr 16 18:11:47.783278 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:47.783244 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:49.863603 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:49.863572 2580 generic.go:358] "Generic (PLEG): container finished" podID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerID="26c7c369f7aacc0d750a219f67aba363ff8fd13e4402ad1ea2dbe79409df3539" exitCode=0 Apr 16 18:11:49.863980 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:49.863647 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" 
event={"ID":"a0a1c392-1fc7-479f-935c-19d3b4c94f96","Type":"ContainerDied","Data":"26c7c369f7aacc0d750a219f67aba363ff8fd13e4402ad1ea2dbe79409df3539"} Apr 16 18:11:50.869478 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:50.869444 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" event={"ID":"a0a1c392-1fc7-479f-935c-19d3b4c94f96","Type":"ContainerStarted","Data":"7d08df5326be3cd962e6029ddc758e6f47c9b9452d797afc19f354b202ec7ca5"} Apr 16 18:11:50.898395 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:50.898337 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podStartSLOduration=6.898320459 podStartE2EDuration="6.898320459s" podCreationTimestamp="2026-04-16 18:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:50.89772294 +0000 UTC m=+1884.271897370" watchObservedRunningTime="2026-04-16 18:11:50.898320459 +0000 UTC m=+1884.272494884" Apr 16 18:11:53.103460 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.103420 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx"] Apr 16 18:11:53.103910 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.103826 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="main" containerID="cri-o://63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075" gracePeriod=30 Apr 16 18:11:53.104614 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.104243 2580 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="tokenizer" containerID="cri-o://8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923" gracePeriod=30 Apr 16 18:11:53.109735 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.109711 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"] Apr 16 18:11:53.110113 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.110039 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" podUID="268868a0-2a97-4342-ab55-71325fb837de" containerName="main" containerID="cri-o://7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a" gracePeriod=30 Apr 16 18:11:53.373410 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.373327 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:53.479021 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.478993 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-dshm\") pod \"268868a0-2a97-4342-ab55-71325fb837de\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " Apr 16 18:11:53.479297 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.479119 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bgrr\" (UniqueName: \"kubernetes.io/projected/268868a0-2a97-4342-ab55-71325fb837de-kube-api-access-5bgrr\") pod \"268868a0-2a97-4342-ab55-71325fb837de\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " Apr 16 18:11:53.479297 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.479149 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-model-cache\") pod \"268868a0-2a97-4342-ab55-71325fb837de\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " Apr 16 18:11:53.479297 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.479191 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-kserve-provision-location\") pod \"268868a0-2a97-4342-ab55-71325fb837de\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " Apr 16 18:11:53.479297 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.479219 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-home\") pod \"268868a0-2a97-4342-ab55-71325fb837de\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " Apr 16 18:11:53.479297 ip-10-0-143-216 
kubenswrapper[2580]: I0416 18:11:53.479257 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/268868a0-2a97-4342-ab55-71325fb837de-tls-certs\") pod \"268868a0-2a97-4342-ab55-71325fb837de\" (UID: \"268868a0-2a97-4342-ab55-71325fb837de\") " Apr 16 18:11:53.479601 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.479440 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-model-cache" (OuterVolumeSpecName: "model-cache") pod "268868a0-2a97-4342-ab55-71325fb837de" (UID: "268868a0-2a97-4342-ab55-71325fb837de"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:53.479601 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.479560 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-model-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:53.479703 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.479626 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-home" (OuterVolumeSpecName: "home") pod "268868a0-2a97-4342-ab55-71325fb837de" (UID: "268868a0-2a97-4342-ab55-71325fb837de"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:53.481620 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.481586 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268868a0-2a97-4342-ab55-71325fb837de-kube-api-access-5bgrr" (OuterVolumeSpecName: "kube-api-access-5bgrr") pod "268868a0-2a97-4342-ab55-71325fb837de" (UID: "268868a0-2a97-4342-ab55-71325fb837de"). InnerVolumeSpecName "kube-api-access-5bgrr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:11:53.481737 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.481679 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268868a0-2a97-4342-ab55-71325fb837de-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "268868a0-2a97-4342-ab55-71325fb837de" (UID: "268868a0-2a97-4342-ab55-71325fb837de"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:53.482021 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.482002 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-dshm" (OuterVolumeSpecName: "dshm") pod "268868a0-2a97-4342-ab55-71325fb837de" (UID: "268868a0-2a97-4342-ab55-71325fb837de"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:53.548662 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.548617 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "268868a0-2a97-4342-ab55-71325fb837de" (UID: "268868a0-2a97-4342-ab55-71325fb837de"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:53.580684 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.580639 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bgrr\" (UniqueName: \"kubernetes.io/projected/268868a0-2a97-4342-ab55-71325fb837de-kube-api-access-5bgrr\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:53.580684 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.580672 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:53.580684 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.580682 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-home\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:53.580684 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.580692 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/268868a0-2a97-4342-ab55-71325fb837de-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:53.580964 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.580702 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/268868a0-2a97-4342-ab55-71325fb837de-dshm\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:53.883852 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.883818 2580 generic.go:358] "Generic (PLEG): container finished" podID="268868a0-2a97-4342-ab55-71325fb837de" containerID="7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a" exitCode=0 Apr 16 18:11:53.884046 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.883914 2580 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" event={"ID":"268868a0-2a97-4342-ab55-71325fb837de","Type":"ContainerDied","Data":"7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a"} Apr 16 18:11:53.884046 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.883932 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" Apr 16 18:11:53.884046 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.883953 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j" event={"ID":"268868a0-2a97-4342-ab55-71325fb837de","Type":"ContainerDied","Data":"247f2b71b17389064bffb5c4638ff834cc9ed71a01024d80dd944fbe13e2cd97"} Apr 16 18:11:53.884046 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.883977 2580 scope.go:117] "RemoveContainer" containerID="7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a" Apr 16 18:11:53.886047 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.886022 2580 generic.go:358] "Generic (PLEG): container finished" podID="fdd0044f-5615-4a72-ad25-969773390953" containerID="63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075" exitCode=0 Apr 16 18:11:53.886146 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.886064 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" event={"ID":"fdd0044f-5615-4a72-ad25-969773390953","Type":"ContainerDied","Data":"63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075"} Apr 16 18:11:53.892930 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.892911 2580 scope.go:117] "RemoveContainer" containerID="a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197" Apr 16 18:11:53.916181 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.916133 2580 scope.go:117] 
"RemoveContainer" containerID="7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a" Apr 16 18:11:53.916602 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:53.916572 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a\": container with ID starting with 7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a not found: ID does not exist" containerID="7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a" Apr 16 18:11:53.916717 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.916618 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a"} err="failed to get container status \"7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a\": rpc error: code = NotFound desc = could not find container \"7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a\": container with ID starting with 7364a7fe40e83b7a9d9675daf82ff18fb80b332b2ac2fc4379d59516dd43064a not found: ID does not exist" Apr 16 18:11:53.916717 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.916645 2580 scope.go:117] "RemoveContainer" containerID="a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197" Apr 16 18:11:53.917018 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:53.916985 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197\": container with ID starting with a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197 not found: ID does not exist" containerID="a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197" Apr 16 18:11:53.917092 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.917014 2580 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197"} err="failed to get container status \"a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197\": rpc error: code = NotFound desc = could not find container \"a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197\": container with ID starting with a48373d05ccba1d659a12cafde4d4fd74a5f0dbbe1e39ab8cd8ff29b61af8197 not found: ID does not exist" Apr 16 18:11:53.925593 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.925566 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"] Apr 16 18:11:53.930223 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:53.930200 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-99b6bd69d-zk74j"] Apr 16 18:11:54.457878 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.457850 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" Apr 16 18:11:54.588733 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.588627 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-uds\") pod \"fdd0044f-5615-4a72-ad25-969773390953\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " Apr 16 18:11:54.588733 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.588674 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd0044f-5615-4a72-ad25-969773390953-tls-certs\") pod \"fdd0044f-5615-4a72-ad25-969773390953\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " Apr 16 18:11:54.588733 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.588732 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q278\" (UniqueName: \"kubernetes.io/projected/fdd0044f-5615-4a72-ad25-969773390953-kube-api-access-8q278\") pod \"fdd0044f-5615-4a72-ad25-969773390953\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " Apr 16 18:11:54.589040 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.588761 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-tmp\") pod \"fdd0044f-5615-4a72-ad25-969773390953\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " Apr 16 18:11:54.589040 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.588797 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-cache\") pod \"fdd0044f-5615-4a72-ad25-969773390953\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " Apr 16 
18:11:54.589040 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.588848 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-kserve-provision-location\") pod \"fdd0044f-5615-4a72-ad25-969773390953\" (UID: \"fdd0044f-5615-4a72-ad25-969773390953\") " Apr 16 18:11:54.589040 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.588863 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fdd0044f-5615-4a72-ad25-969773390953" (UID: "fdd0044f-5615-4a72-ad25-969773390953"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:54.589323 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.589088 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-uds\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:54.589323 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.589088 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fdd0044f-5615-4a72-ad25-969773390953" (UID: "fdd0044f-5615-4a72-ad25-969773390953"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:54.589323 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.589189 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fdd0044f-5615-4a72-ad25-969773390953" (UID: "fdd0044f-5615-4a72-ad25-969773390953"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:54.589584 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.589563 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fdd0044f-5615-4a72-ad25-969773390953" (UID: "fdd0044f-5615-4a72-ad25-969773390953"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:11:54.591003 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.590977 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd0044f-5615-4a72-ad25-969773390953-kube-api-access-8q278" (OuterVolumeSpecName: "kube-api-access-8q278") pod "fdd0044f-5615-4a72-ad25-969773390953" (UID: "fdd0044f-5615-4a72-ad25-969773390953"). InnerVolumeSpecName "kube-api-access-8q278". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:11:54.591309 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.591286 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd0044f-5615-4a72-ad25-969773390953-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fdd0044f-5615-4a72-ad25-969773390953" (UID: "fdd0044f-5615-4a72-ad25-969773390953"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:54.690256 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.690214 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8q278\" (UniqueName: \"kubernetes.io/projected/fdd0044f-5615-4a72-ad25-969773390953-kube-api-access-8q278\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:54.690256 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.690244 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-tmp\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:54.690256 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.690253 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-tokenizer-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:54.690256 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.690263 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fdd0044f-5615-4a72-ad25-969773390953-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:54.690634 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.690273 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fdd0044f-5615-4a72-ad25-969773390953-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\"" Apr 16 18:11:54.893124 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.893031 2580 generic.go:358] "Generic (PLEG): container finished" podID="fdd0044f-5615-4a72-ad25-969773390953" containerID="8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923" exitCode=0 Apr 16 18:11:54.893124 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.893060 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" event={"ID":"fdd0044f-5615-4a72-ad25-969773390953","Type":"ContainerDied","Data":"8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923"}
Apr 16 18:11:54.893124 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.893115 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx"
Apr 16 18:11:54.893379 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.893131 2580 scope.go:117] "RemoveContainer" containerID="8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923"
Apr 16 18:11:54.893379 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.893118 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx" event={"ID":"fdd0044f-5615-4a72-ad25-969773390953","Type":"ContainerDied","Data":"285dfded83ce3726d877a9d255799c39d0e38079f02e4689520a66b2bd36e9b3"}
Apr 16 18:11:54.902536 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.902513 2580 scope.go:117] "RemoveContainer" containerID="63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075"
Apr 16 18:11:54.910631 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.910614 2580 scope.go:117] "RemoveContainer" containerID="f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9"
Apr 16 18:11:54.918059 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.918042 2580 scope.go:117] "RemoveContainer" containerID="8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923"
Apr 16 18:11:54.918336 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:54.918316 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923\": container with ID starting with 8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923 not found: ID does not exist" containerID="8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923"
Apr 16 18:11:54.918404 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.918345 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923"} err="failed to get container status \"8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923\": rpc error: code = NotFound desc = could not find container \"8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923\": container with ID starting with 8de268431aeb35b538699469ba40e45765261c577c823297d716bfd3c8f76923 not found: ID does not exist"
Apr 16 18:11:54.918404 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.918371 2580 scope.go:117] "RemoveContainer" containerID="63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075"
Apr 16 18:11:54.918593 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:54.918575 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075\": container with ID starting with 63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075 not found: ID does not exist" containerID="63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075"
Apr 16 18:11:54.918642 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.918599 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075"} err="failed to get container status \"63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075\": rpc error: code = NotFound desc = could not find container \"63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075\": container with ID starting with 63463ae69677d6812089ffcb8e67284f5703b1b73e737585dbce2c2b3b950075 not found: ID does not exist"
Apr 16 18:11:54.918642 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.918616 2580 scope.go:117] "RemoveContainer" containerID="f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9"
Apr 16 18:11:54.918858 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:11:54.918837 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9\": container with ID starting with f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9 not found: ID does not exist" containerID="f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9"
Apr 16 18:11:54.918906 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.918865 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9"} err="failed to get container status \"f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9\": rpc error: code = NotFound desc = could not find container \"f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9\": container with ID starting with f8502a0cc5c3fd9e967cbc4dfedafb8ed59ab578399dee5c6fcb327fbdb476a9 not found: ID does not exist"
Apr 16 18:11:54.932525 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.932502 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx"]
Apr 16 18:11:54.945207 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:54.945182 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-69b45qrbmx"]
Apr 16 18:11:55.035682 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:55.035650 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"
Apr 16 18:11:55.035682 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:55.035689 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"
Apr 16 18:11:55.036929 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:55.036901 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:11:55.263991 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:55.263953 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268868a0-2a97-4342-ab55-71325fb837de" path="/var/lib/kubelet/pods/268868a0-2a97-4342-ab55-71325fb837de/volumes"
Apr 16 18:11:55.264452 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:11:55.264437 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd0044f-5615-4a72-ad25-969773390953" path="/var/lib/kubelet/pods/fdd0044f-5615-4a72-ad25-969773390953/volumes"
Apr 16 18:12:05.036688 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:12:05.036647 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:12:15.036544 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:12:15.036484 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:12:25.036014 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:12:25.035971 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:12:35.036296 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:12:35.036250 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:12:45.035916 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:12:45.035870 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:12:55.036266 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:12:55.036221 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:13:05.036611 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:05.036558 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:13:15.037045 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:15.036951 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 16 18:13:25.046323 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:25.046285 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"
Apr 16 18:13:25.054254 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:25.054230 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"
Apr 16 18:13:37.173245 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:37.173210 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"]
Apr 16 18:13:37.173741 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:37.173550 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" containerID="cri-o://7d08df5326be3cd962e6029ddc758e6f47c9b9452d797afc19f354b202ec7ca5" gracePeriod=30
Apr 16 18:13:53.990966 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:53.990940 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:13:54.012940 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:54.012902 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:13:55.127684 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:55.127655 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:13:55.141202 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:55.141178 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:13:56.238439 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:56.238409 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:13:56.248381 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:56.248358 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:13:57.312803 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:57.312772 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:13:57.323518 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:57.323488 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:13:58.393765 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:58.393735 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:13:58.412862 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:58.412832 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:13:59.543944 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:59.543915 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:13:59.555960 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:13:59.555936 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:14:00.675647 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:00.675617 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:14:00.690458 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:00.690431 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:14:01.802751 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:01.802721 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:14:01.817695 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:01.817671 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:14:02.943726 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:02.943692 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:14:02.961060 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:02.961034 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:14:04.192458 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:04.192429 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:14:04.203002 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:04.202974 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:14:05.360882 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:05.360849 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:14:05.376613 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:05.376587 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:14:06.475956 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:06.475925 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:14:06.490218 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:06.490193 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:14:07.380800 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.380772 2580 generic.go:358] "Generic (PLEG): container finished" podID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerID="7d08df5326be3cd962e6029ddc758e6f47c9b9452d797afc19f354b202ec7ca5" exitCode=137
Apr 16 18:14:07.380961 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.380834 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" event={"ID":"a0a1c392-1fc7-479f-935c-19d3b4c94f96","Type":"ContainerDied","Data":"7d08df5326be3cd962e6029ddc758e6f47c9b9452d797afc19f354b202ec7ca5"}
Apr 16 18:14:07.441335 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.441268 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"
Apr 16 18:14:07.553911 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.553881 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-model-cache\") pod \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") "
Apr 16 18:14:07.554416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.553996 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm75h\" (UniqueName: \"kubernetes.io/projected/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kube-api-access-hm75h\") pod \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") "
Apr 16 18:14:07.554416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.554030 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-home\") pod \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") "
Apr 16 18:14:07.554416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.554085 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-dshm\") pod \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") "
Apr 16 18:14:07.554416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.554141 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1c392-1fc7-479f-935c-19d3b4c94f96-tls-certs\") pod \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") "
Apr 16 18:14:07.554416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.554203 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kserve-provision-location\") pod \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\" (UID: \"a0a1c392-1fc7-479f-935c-19d3b4c94f96\") "
Apr 16 18:14:07.554416 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.554196 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-model-cache" (OuterVolumeSpecName: "model-cache") pod "a0a1c392-1fc7-479f-935c-19d3b4c94f96" (UID: "a0a1c392-1fc7-479f-935c-19d3b4c94f96"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:14:07.554677 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.554503 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-home" (OuterVolumeSpecName: "home") pod "a0a1c392-1fc7-479f-935c-19d3b4c94f96" (UID: "a0a1c392-1fc7-479f-935c-19d3b4c94f96"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:14:07.554677 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.554508 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-model-cache\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:14:07.556485 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.556449 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kube-api-access-hm75h" (OuterVolumeSpecName: "kube-api-access-hm75h") pod "a0a1c392-1fc7-479f-935c-19d3b4c94f96" (UID: "a0a1c392-1fc7-479f-935c-19d3b4c94f96"). InnerVolumeSpecName "kube-api-access-hm75h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:14:07.556939 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.556917 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-dshm" (OuterVolumeSpecName: "dshm") pod "a0a1c392-1fc7-479f-935c-19d3b4c94f96" (UID: "a0a1c392-1fc7-479f-935c-19d3b4c94f96"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:14:07.557023 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.556920 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a1c392-1fc7-479f-935c-19d3b4c94f96-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a0a1c392-1fc7-479f-935c-19d3b4c94f96" (UID: "a0a1c392-1fc7-479f-935c-19d3b4c94f96"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:14:07.612470 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.612428 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a0a1c392-1fc7-479f-935c-19d3b4c94f96" (UID: "a0a1c392-1fc7-479f-935c-19d3b4c94f96"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:14:07.636907 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.636876 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/main/0.log"
Apr 16 18:14:07.653027 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.652997 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls_a0a1c392-1fc7-479f-935c-19d3b4c94f96/storage-initializer/0.log"
Apr 16 18:14:07.655456 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.655438 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hm75h\" (UniqueName: \"kubernetes.io/projected/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kube-api-access-hm75h\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:14:07.655513 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.655460 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-home\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:14:07.655513 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.655470 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-dshm\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:14:07.655513 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.655478 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a1c392-1fc7-479f-935c-19d3b4c94f96-tls-certs\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:14:07.655513 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:07.655488 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0a1c392-1fc7-479f-935c-19d3b4c94f96-kserve-provision-location\") on node \"ip-10-0-143-216.ec2.internal\" DevicePath \"\""
Apr 16 18:14:08.387544 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:08.387508 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls" event={"ID":"a0a1c392-1fc7-479f-935c-19d3b4c94f96","Type":"ContainerDied","Data":"f99e014ceeee4e4622d4e726ae88ff8a306375866ad53dda5218d318165d3d9e"}
Apr 16 18:14:08.387544 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:08.387541 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"
Apr 16 18:14:08.387882 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:08.387559 2580 scope.go:117] "RemoveContainer" containerID="7d08df5326be3cd962e6029ddc758e6f47c9b9452d797afc19f354b202ec7ca5"
Apr 16 18:14:08.407843 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:08.407819 2580 scope.go:117] "RemoveContainer" containerID="26c7c369f7aacc0d750a219f67aba363ff8fd13e4402ad1ea2dbe79409df3539"
Apr 16 18:14:08.413883 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:08.413860 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"]
Apr 16 18:14:08.421641 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:08.421620 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76dfbc98d6-7jmls"]
Apr 16 18:14:09.264020 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:09.263988 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" path="/var/lib/kubelet/pods/a0a1c392-1fc7-479f-935c-19d3b4c94f96/volumes"
Apr 16 18:14:11.666573 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:11.666542 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-cjfrq_34906abf-3955-483a-a0e0-0732c2dd0a23/authorino/0.log"
Apr 16 18:14:11.684753 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:11.684725 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-cjqht_56ad4f13-8e31-4272-a2b9-b99b022f311a/manager/0.log"
Apr 16 18:14:11.706143 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:11.706117 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-m2mv6_88629fdb-a326-4062-a0f7-caece9e92898/manager/0.log"
Apr 16 18:14:17.677727 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:17.677693 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lr22f_3cad71be-0ff8-4d0a-b7af-b7f4cd2a7d68/global-pull-secret-syncer/0.log"
Apr 16 18:14:17.823145 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:17.823116 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qkj9z_872f1cbb-5c2b-4f0b-a6f4-1d4693e07813/konnectivity-agent/0.log"
Apr 16 18:14:17.929951 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:17.929870 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-216.ec2.internal_3d0025abf26159e90c9c59e296cbe6be/haproxy/0.log"
Apr 16 18:14:21.873708 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:21.873678 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-cjfrq_34906abf-3955-483a-a0e0-0732c2dd0a23/authorino/0.log"
Apr 16 18:14:21.941760 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:21.941730 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-cjqht_56ad4f13-8e31-4272-a2b9-b99b022f311a/manager/0.log"
Apr 16 18:14:21.985123 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:21.985097 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-m2mv6_88629fdb-a326-4062-a0f7-caece9e92898/manager/0.log"
Apr 16 18:14:23.109373 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.109348 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_08803585-a323-4ee6-80e0-b8a63b822ca2/alertmanager/0.log"
Apr 16 18:14:23.160316 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.160287 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_08803585-a323-4ee6-80e0-b8a63b822ca2/config-reloader/0.log"
Apr 16 18:14:23.256422 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.256398 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_08803585-a323-4ee6-80e0-b8a63b822ca2/kube-rbac-proxy-web/0.log"
Apr 16 18:14:23.308435 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.308365 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_08803585-a323-4ee6-80e0-b8a63b822ca2/kube-rbac-proxy/0.log"
Apr 16 18:14:23.341282 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.341257 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_08803585-a323-4ee6-80e0-b8a63b822ca2/kube-rbac-proxy-metric/0.log"
Apr 16 18:14:23.368336 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.368312 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_08803585-a323-4ee6-80e0-b8a63b822ca2/prom-label-proxy/0.log"
Apr 16 18:14:23.397267 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.397247 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_08803585-a323-4ee6-80e0-b8a63b822ca2/init-config-reloader/0.log"
Apr 16 18:14:23.543232 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.543202 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-79cxz_a9ce56c7-1a84-4a39-b540-ec90e251a81a/kube-state-metrics/0.log"
Apr 16 18:14:23.570739 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.570670 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-79cxz_a9ce56c7-1a84-4a39-b540-ec90e251a81a/kube-rbac-proxy-main/0.log"
Apr 16 18:14:23.605852 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.605836 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-79cxz_a9ce56c7-1a84-4a39-b540-ec90e251a81a/kube-rbac-proxy-self/0.log"
Apr 16 18:14:23.640455 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.640438 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-77bdf9f56d-5n2h8_55588953-abf7-4d65-9e95-decf43201d1d/metrics-server/0.log"
Apr 16 18:14:23.679788 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.679766 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-2q6xn_7436ea53-0555-458d-bb0d-86a73624beff/monitoring-plugin/0.log"
Apr 16 18:14:23.915032 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.914953 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ndrq9_30adbe00-e7ec-49f8-a027-027cac12b3e9/node-exporter/0.log"
Apr 16 18:14:23.948199 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.948146 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ndrq9_30adbe00-e7ec-49f8-a027-027cac12b3e9/kube-rbac-proxy/0.log"
Apr 16 18:14:23.976917 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:23.976886 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ndrq9_30adbe00-e7ec-49f8-a027-027cac12b3e9/init-textfile/0.log"
Apr 16 18:14:24.021876 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.021847 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-688vs_ef001566-3c97-441f-9199-22bb6149bb4a/kube-rbac-proxy-main/0.log"
Apr 16 18:14:24.049106 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.049081 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-688vs_ef001566-3c97-441f-9199-22bb6149bb4a/kube-rbac-proxy-self/0.log"
Apr 16 18:14:24.078534 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.078511 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-688vs_ef001566-3c97-441f-9199-22bb6149bb4a/openshift-state-metrics/0.log"
Apr 16 18:14:24.130423 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.130398 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a3188937-155b-4c43-acda-ef56b6d9499b/prometheus/0.log"
Apr 16 18:14:24.155602 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.155576 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a3188937-155b-4c43-acda-ef56b6d9499b/config-reloader/0.log"
Apr 16 18:14:24.180358 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.180339 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a3188937-155b-4c43-acda-ef56b6d9499b/thanos-sidecar/0.log"
Apr 16 18:14:24.217998 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.217977 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a3188937-155b-4c43-acda-ef56b6d9499b/kube-rbac-proxy-web/0.log"
Apr 16 18:14:24.246053 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.246034 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a3188937-155b-4c43-acda-ef56b6d9499b/kube-rbac-proxy/0.log"
Apr 16 18:14:24.274523 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.274501 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a3188937-155b-4c43-acda-ef56b6d9499b/kube-rbac-proxy-thanos/0.log"
Apr 16 18:14:24.308716 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.308696 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a3188937-155b-4c43-acda-ef56b6d9499b/init-config-reloader/0.log"
Apr 16 18:14:24.351358 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.351335 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7rzmg_8038cd78-0be0-4711-b1bc-f799dd11b41d/prometheus-operator/0.log"
Apr 16 18:14:24.386212 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.386192 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7rzmg_8038cd78-0be0-4711-b1bc-f799dd11b41d/kube-rbac-proxy/0.log"
Apr 16 18:14:24.469805 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.469736 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55ccfbc5fd-sn2b2_08e11f0a-18dd-44a4-ac13-9adb1ead4cfd/telemeter-client/0.log"
Apr 16 18:14:24.505126 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.505100 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55ccfbc5fd-sn2b2_08e11f0a-18dd-44a4-ac13-9adb1ead4cfd/reload/0.log"
Apr 16 18:14:24.558401 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.558380 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55ccfbc5fd-sn2b2_08e11f0a-18dd-44a4-ac13-9adb1ead4cfd/kube-rbac-proxy/0.log"
Apr 16 18:14:24.614959 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.614934 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fd5fbbd54-tdvhw_9c8951a4-4f69-4f1e-9150-e3ddb02d29b5/thanos-query/0.log"
Apr 16 18:14:24.667981 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.667956 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fd5fbbd54-tdvhw_9c8951a4-4f69-4f1e-9150-e3ddb02d29b5/kube-rbac-proxy-web/0.log"
Apr 16 18:14:24.723875 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.723803 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fd5fbbd54-tdvhw_9c8951a4-4f69-4f1e-9150-e3ddb02d29b5/kube-rbac-proxy/0.log"
Apr 16 18:14:24.785028 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.785002 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fd5fbbd54-tdvhw_9c8951a4-4f69-4f1e-9150-e3ddb02d29b5/prom-label-proxy/0.log"
Apr 16 18:14:24.833553 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.833525 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fd5fbbd54-tdvhw_9c8951a4-4f69-4f1e-9150-e3ddb02d29b5/kube-rbac-proxy-rules/0.log"
Apr 16 18:14:24.879358 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:24.879331 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-fd5fbbd54-tdvhw_9c8951a4-4f69-4f1e-9150-e3ddb02d29b5/kube-rbac-proxy-metrics/0.log"
Apr 16 18:14:26.246309 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246273 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn"]
Apr 16 18:14:26.246707 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246669 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="268868a0-2a97-4342-ab55-71325fb837de" containerName="storage-initializer"
Apr 16 18:14:26.246707 ip-10-0-143-216
kubenswrapper[2580]: I0416 18:14:26.246680 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="268868a0-2a97-4342-ab55-71325fb837de" containerName="storage-initializer" Apr 16 18:14:26.246707 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246690 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="tokenizer" Apr 16 18:14:26.246707 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246695 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="tokenizer" Apr 16 18:14:26.246707 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246707 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="main" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246712 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="main" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246717 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="storage-initializer" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246723 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="storage-initializer" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246729 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246734 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 
18:14:26.246749 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="storage-initializer" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246755 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="storage-initializer" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246762 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="268868a0-2a97-4342-ab55-71325fb837de" containerName="main" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246767 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="268868a0-2a97-4342-ab55-71325fb837de" containerName="main" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246827 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="268868a0-2a97-4342-ab55-71325fb837de" containerName="main" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246834 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="tokenizer" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246840 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0a1c392-1fc7-479f-935c-19d3b4c94f96" containerName="main" Apr 16 18:14:26.246869 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.246846 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdd0044f-5615-4a72-ad25-969773390953" containerName="main" Apr 16 18:14:26.250083 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.250065 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.255525 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.255497 2580 status_manager.go:895] "Failed to get status for pod" podUID="1bd6a794-36be-4c7c-b398-5ce7e159306c" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" err="pods \"perf-node-gather-daemonset-22lnn\" is forbidden: User \"system:node:ip-10-0-143-216.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kw59r\": no relationship found between node 'ip-10-0-143-216.ec2.internal' and this object" Apr 16 18:14:26.256009 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:14:26.255973 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-143-216.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-must-gather-kw59r\": no relationship found between node 'ip-10-0-143-216.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-must-gather-kw59r\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 16 18:14:26.256567 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:14:26.256537 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-143-216.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-must-gather-kw59r\": no relationship found between node 'ip-10-0-143-216.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-must-gather-kw59r\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 16 18:14:26.256659 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:14:26.256635 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"default-dockercfg-tg6h4\" is forbidden: User 
\"system:node:ip-10-0-143-216.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-must-gather-kw59r\": no relationship found between node 'ip-10-0-143-216.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-must-gather-kw59r\"/\"default-dockercfg-tg6h4\"" type="*v1.Secret" Apr 16 18:14:26.268293 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.268267 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn"] Apr 16 18:14:26.317009 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.316972 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-lib-modules\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.317226 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.317043 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-podres\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.317226 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.317071 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-proc\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.317226 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.317100 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4smbc\" (UniqueName: \"kubernetes.io/projected/1bd6a794-36be-4c7c-b398-5ce7e159306c-kube-api-access-4smbc\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.317226 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.317132 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-sys\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.417724 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417684 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-lib-modules\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.417919 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417774 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-podres\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.417919 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417802 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-proc\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " 
pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.417919 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417839 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4smbc\" (UniqueName: \"kubernetes.io/projected/1bd6a794-36be-4c7c-b398-5ce7e159306c-kube-api-access-4smbc\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.417919 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-sys\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.417919 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417896 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-proc\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.418195 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417841 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-lib-modules\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.418195 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417926 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-podres\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:26.418195 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:26.417981 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1bd6a794-36be-4c7c-b398-5ce7e159306c-sys\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:27.416251 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:27.416223 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kw59r\"/\"openshift-service-ca.crt\"" Apr 16 18:14:27.429500 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:14:27.429483 2580 projected.go:289] Couldn't get configMap openshift-must-gather-kw59r/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 16 18:14:27.429605 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:14:27.429515 2580 projected.go:194] Error preparing data for projected volume kube-api-access-4smbc for pod openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn: failed to sync configmap cache: timed out waiting for the condition Apr 16 18:14:27.429674 ip-10-0-143-216 kubenswrapper[2580]: E0416 18:14:27.429657 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1bd6a794-36be-4c7c-b398-5ce7e159306c-kube-api-access-4smbc podName:1bd6a794-36be-4c7c-b398-5ce7e159306c nodeName:}" failed. No retries permitted until 2026-04-16 18:14:27.92956356 +0000 UTC m=+2041.303737973 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4smbc" (UniqueName: "kubernetes.io/projected/1bd6a794-36be-4c7c-b398-5ce7e159306c-kube-api-access-4smbc") pod "perf-node-gather-daemonset-22lnn" (UID: "1bd6a794-36be-4c7c-b398-5ce7e159306c") : failed to sync configmap cache: timed out waiting for the condition Apr 16 18:14:27.480005 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:27.479978 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kw59r\"/\"kube-root-ca.crt\"" Apr 16 18:14:27.554740 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:27.554685 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kw59r\"/\"default-dockercfg-tg6h4\"" Apr 16 18:14:27.931319 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:27.931275 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4smbc\" (UniqueName: \"kubernetes.io/projected/1bd6a794-36be-4c7c-b398-5ce7e159306c-kube-api-access-4smbc\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:27.933871 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:27.933845 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4smbc\" (UniqueName: \"kubernetes.io/projected/1bd6a794-36be-4c7c-b398-5ce7e159306c-kube-api-access-4smbc\") pod \"perf-node-gather-daemonset-22lnn\" (UID: \"1bd6a794-36be-4c7c-b398-5ce7e159306c\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:28.059449 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.059421 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:28.186729 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.186704 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn"] Apr 16 18:14:28.189130 ip-10-0-143-216 kubenswrapper[2580]: W0416 18:14:28.189105 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1bd6a794_36be_4c7c_b398_5ce7e159306c.slice/crio-48e95f191e60d8ff23f39b20168fc83fd7218ab85d35e29dfc56c71b35da17f1 WatchSource:0}: Error finding container 48e95f191e60d8ff23f39b20168fc83fd7218ab85d35e29dfc56c71b35da17f1: Status 404 returned error can't find the container with id 48e95f191e60d8ff23f39b20168fc83fd7218ab85d35e29dfc56c71b35da17f1 Apr 16 18:14:28.190658 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.190638 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:14:28.458261 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.458137 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" event={"ID":"1bd6a794-36be-4c7c-b398-5ce7e159306c","Type":"ContainerStarted","Data":"559fd364d814cfed7bd5058e11d9e2e4c51cbfc6e00d41da335e31ceaa8c0a33"} Apr 16 18:14:28.458261 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.458200 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" event={"ID":"1bd6a794-36be-4c7c-b398-5ce7e159306c","Type":"ContainerStarted","Data":"48e95f191e60d8ff23f39b20168fc83fd7218ab85d35e29dfc56c71b35da17f1"} Apr 16 18:14:28.458261 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.458235 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:28.478935 ip-10-0-143-216 
kubenswrapper[2580]: I0416 18:14:28.478886 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" podStartSLOduration=2.478872619 podStartE2EDuration="2.478872619s" podCreationTimestamp="2026-04-16 18:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:14:28.477618948 +0000 UTC m=+2041.851793382" watchObservedRunningTime="2026-04-16 18:14:28.478872619 +0000 UTC m=+2041.853047081" Apr 16 18:14:28.608421 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.608390 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-74mzq_b8b74ff2-67c6-488d-b82d-55527e9c4661/dns/0.log" Apr 16 18:14:28.636590 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.636557 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-74mzq_b8b74ff2-67c6-488d-b82d-55527e9c4661/kube-rbac-proxy/0.log" Apr 16 18:14:28.846476 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:28.846402 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qnz92_d7bc1548-2cb4-4ade-bff6-dbdeadb9d76f/dns-node-resolver/0.log" Apr 16 18:14:29.468855 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:29.468829 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mxk78_4a975d1a-4be7-41a5-b7fd-95561bba816e/node-ca/0.log" Apr 16 18:14:31.046465 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:31.046429 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xq8lz_3bb57ce2-2350-45e9-b686-a1f3b5c5c84e/serve-healthcheck-canary/0.log" Apr 16 18:14:31.586250 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:31.586224 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-ftrk9_fca7dc6e-d365-471c-9ee4-67a409ac1c9e/kube-rbac-proxy/0.log" Apr 16 18:14:31.620217 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:31.620183 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ftrk9_fca7dc6e-d365-471c-9ee4-67a409ac1c9e/exporter/0.log" Apr 16 18:14:31.658658 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:31.658630 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ftrk9_fca7dc6e-d365-471c-9ee4-67a409ac1c9e/extractor/0.log" Apr 16 18:14:34.472845 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:34.472818 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-22lnn" Apr 16 18:14:34.675653 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:34.675626 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-64d875bb5b-pwcvk_d5236521-7026-4b2f-9668-1700617db059/manager/0.log" Apr 16 18:14:35.279023 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:35.278989 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7f8f4564d-lfqzw_d6cad08b-2470-4a70-bc37-d6f6e9e33c95/manager/0.log" Apr 16 18:14:35.336833 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:35.336800 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-5578b86f79-4x5xj_a1620162-a9d2-4631-977e-38406f321739/manager/0.log" Apr 16 18:14:35.358555 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:35.358534 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-w6nmw_84ef5904-1060-40fe-992b-4742e550121d/server/0.log" Apr 16 18:14:35.552436 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:35.552360 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_odh-model-controller-696fc77849-2lx6g_220ff77b-8f49-4f08-afdb-942b4f149aa9/manager/0.log" Apr 16 18:14:35.613986 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:35.613961 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-n8vbf_b7450c29-7c4d-4529-aa0e-811267d2a02c/seaweedfs/0.log" Apr 16 18:14:42.379520 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.379450 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blkts_434e8414-887d-4565-9d3b-620183c5537a/kube-multus-additional-cni-plugins/0.log" Apr 16 18:14:42.402636 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.402615 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blkts_434e8414-887d-4565-9d3b-620183c5537a/egress-router-binary-copy/0.log" Apr 16 18:14:42.426002 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.425979 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blkts_434e8414-887d-4565-9d3b-620183c5537a/cni-plugins/0.log" Apr 16 18:14:42.450322 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.450288 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blkts_434e8414-887d-4565-9d3b-620183c5537a/bond-cni-plugin/0.log" Apr 16 18:14:42.476458 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.476433 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blkts_434e8414-887d-4565-9d3b-620183c5537a/routeoverride-cni/0.log" Apr 16 18:14:42.501482 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.501460 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blkts_434e8414-887d-4565-9d3b-620183c5537a/whereabouts-cni-bincopy/0.log" Apr 16 18:14:42.531332 ip-10-0-143-216 kubenswrapper[2580]: 
I0416 18:14:42.531306 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blkts_434e8414-887d-4565-9d3b-620183c5537a/whereabouts-cni/0.log" Apr 16 18:14:42.768181 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.768131 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w72d2_21cfe301-09d7-4af8-8050-a3969d8eb2db/kube-multus/0.log" Apr 16 18:14:42.789995 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.789972 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2st9k_75516b17-54a7-403c-b9a7-20ae8a32ebb7/network-metrics-daemon/0.log" Apr 16 18:14:42.812145 ip-10-0-143-216 kubenswrapper[2580]: I0416 18:14:42.812125 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2st9k_75516b17-54a7-403c-b9a7-20ae8a32ebb7/kube-rbac-proxy/0.log"